Feb 23 14:16:29.598216 master-0 systemd[1]: Starting Kubernetes Kubelet...
Feb 23 14:16:30.685178 master-0 kubenswrapper[4171]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 14:16:30.685178 master-0 kubenswrapper[4171]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 23 14:16:30.685178 master-0 kubenswrapper[4171]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 14:16:30.685178 master-0 kubenswrapper[4171]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 14:16:30.685178 master-0 kubenswrapper[4171]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 23 14:16:30.685178 master-0 kubenswrapper[4171]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
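The deprecation warnings above all point at the kubelet config file named by --config. A minimal sketch of the equivalent KubeletConfiguration fields follows, assuming the kubelet.config.k8s.io/v1beta1 API; the values mirror the flag dump later in this log. Note that --minimum-container-ttl-duration maps to evictionHard/evictionSoft instead, and --pod-infra-container-image has no config-file equivalent (the sandbox image comes from the CRI runtime).

```yaml
kind: KubeletConfiguration
apiVersion: kubelet.config.k8s.io/v1beta1
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --register-with-taints
registerWithTaints:
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
# replaces --system-reserved
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
```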
Feb 23 14:16:30.688021 master-0 kubenswrapper[4171]: I0223 14:16:30.687821 4171 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 23 14:16:30.696840 master-0 kubenswrapper[4171]: W0223 14:16:30.696772 4171 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:16:30.696840 master-0 kubenswrapper[4171]: W0223 14:16:30.696813 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:16:30.696840 master-0 kubenswrapper[4171]: W0223 14:16:30.696830 4171 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:16:30.696840 master-0 kubenswrapper[4171]: W0223 14:16:30.696845 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696857 4171 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696871 4171 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696881 4171 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696889 4171 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696898 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696906 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696926 4171 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696936 4171 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696947 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696958 4171 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696969 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696980 4171 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.696991 4171 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.697001 4171 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.697011 4171 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.697022 4171 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.697031 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.697040 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.697048 4171 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:16:30.697050 master-0 kubenswrapper[4171]: W0223 14:16:30.697057 4171 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697066 4171 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697078 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697089 4171 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697101 4171 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697112 4171 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697122 4171 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697133 4171 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697148 4171 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697161 4171 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697171 4171 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697181 4171 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697192 4171 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697203 4171 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697214 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697228 4171 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697239 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697249 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697260 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:16:30.697721 master-0 kubenswrapper[4171]: W0223 14:16:30.697272 4171 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697287 4171 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697299 4171 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697313 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697324 4171 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697337 4171 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697348 4171 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697359 4171 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697369 4171 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697380 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697391 4171 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697406 4171 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697417 4171 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697426 4171 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697436 4171 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697448 4171 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697458 4171 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697467 4171 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697508 4171 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:16:30.698272 master-0 kubenswrapper[4171]: W0223 14:16:30.697517 4171 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697525 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697536 4171 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697546 4171 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697557 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697567 4171 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697579 4171 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697591 4171 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697601 4171 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697611 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: W0223 14:16:30.697623 4171 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697805 4171 flags.go:64] FLAG: --address="0.0.0.0"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697826 4171 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697844 4171 flags.go:64] FLAG: --anonymous-auth="true"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697859 4171 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697873 4171 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697883 4171 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697897 4171 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697915 4171 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697925 4171 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697935 4171 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 23 14:16:30.698851 master-0 kubenswrapper[4171]: I0223 14:16:30.697945 4171 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.697955 4171 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.697966 4171 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.697976 4171 flags.go:64] FLAG: --cgroup-root=""
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.697985 4171 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.697996 4171 flags.go:64] FLAG: --client-ca-file=""
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698009 4171 flags.go:64] FLAG: --cloud-config=""
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698019 4171 flags.go:64] FLAG: --cloud-provider=""
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698031 4171 flags.go:64] FLAG: --cluster-dns="[]"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698044 4171 flags.go:64] FLAG: --cluster-domain=""
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698053 4171 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698063 4171 flags.go:64] FLAG: --config-dir=""
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698074 4171 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698084 4171 flags.go:64] FLAG: --container-log-max-files="5"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698096 4171 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698106 4171 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698120 4171 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698131 4171 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698141 4171 flags.go:64] FLAG: --contention-profiling="false"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698152 4171 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698161 4171 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698171 4171 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698181 4171 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698193 4171 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698203 4171 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 23 14:16:30.699423 master-0 kubenswrapper[4171]: I0223 14:16:30.698213 4171 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698223 4171 flags.go:64] FLAG: --enable-load-reader="false"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698233 4171 flags.go:64] FLAG: --enable-server="true"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698244 4171 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698255 4171 flags.go:64] FLAG: --event-burst="100"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698266 4171 flags.go:64] FLAG: --event-qps="50"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698278 4171 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698291 4171 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698304 4171 flags.go:64] FLAG: --eviction-hard=""
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698319 4171 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698331 4171 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698341 4171 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698351 4171 flags.go:64] FLAG: --eviction-soft=""
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698363 4171 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698372 4171 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698382 4171 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698392 4171 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698401 4171 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698411 4171 flags.go:64] FLAG: --fail-swap-on="true"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698421 4171 flags.go:64] FLAG: --feature-gates=""
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698432 4171 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698442 4171 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698453 4171 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698465 4171 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698511 4171 flags.go:64] FLAG: --healthz-port="10248"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698521 4171 flags.go:64] FLAG: --help="false"
Feb 23 14:16:30.700244 master-0 kubenswrapper[4171]: I0223 14:16:30.698531 4171 flags.go:64] FLAG: --hostname-override=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698541 4171 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698551 4171 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698561 4171 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698571 4171 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698580 4171 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698589 4171 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698600 4171 flags.go:64] FLAG: --image-service-endpoint=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698610 4171 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698619 4171 flags.go:64] FLAG: --kube-api-burst="100"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698630 4171 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698641 4171 flags.go:64] FLAG: --kube-api-qps="50"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698650 4171 flags.go:64] FLAG: --kube-reserved=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698660 4171 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698670 4171 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698679 4171 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698690 4171 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698699 4171 flags.go:64] FLAG: --lock-file=""
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698709 4171 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698719 4171 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698729 4171 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698743 4171 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698753 4171 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698763 4171 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698773 4171 flags.go:64] FLAG: --logging-format="text"
Feb 23 14:16:30.701101 master-0 kubenswrapper[4171]: I0223 14:16:30.698782 4171 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698792 4171 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698802 4171 flags.go:64] FLAG: --manifest-url=""
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698811 4171 flags.go:64] FLAG: --manifest-url-header=""
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698827 4171 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698837 4171 flags.go:64] FLAG: --max-open-files="1000000"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698849 4171 flags.go:64] FLAG: --max-pods="110"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698858 4171 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698869 4171 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698879 4171 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698889 4171 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698899 4171 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698909 4171 flags.go:64] FLAG: --node-ip="192.168.32.10"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698919 4171 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698940 4171 flags.go:64] FLAG: --node-status-max-images="50"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698950 4171 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698960 4171 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698970 4171 flags.go:64] FLAG: --pod-cidr=""
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698979 4171 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5001a555eb05eef7f23d64667303c2b4db8343ee900c265f7613c40c1db229"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.698994 4171 flags.go:64] FLAG: --pod-manifest-path=""
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.699003 4171 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.699014 4171 flags.go:64] FLAG: --pods-per-core="0"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.699024 4171 flags.go:64] FLAG: --port="10250"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.699034 4171 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 23 14:16:30.701846 master-0 kubenswrapper[4171]: I0223 14:16:30.699044 4171 flags.go:64] FLAG: --provider-id=""
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699054 4171 flags.go:64] FLAG: --qos-reserved=""
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699063 4171 flags.go:64] FLAG: --read-only-port="10255"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699073 4171 flags.go:64] FLAG: --register-node="true"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699083 4171 flags.go:64] FLAG: --register-schedulable="true"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699093 4171 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699110 4171 flags.go:64] FLAG: --registry-burst="10"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699120 4171 flags.go:64] FLAG: --registry-qps="5"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699130 4171 flags.go:64] FLAG: --reserved-cpus=""
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699139 4171 flags.go:64] FLAG: --reserved-memory=""
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699152 4171 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699162 4171 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699172 4171 flags.go:64] FLAG: --rotate-certificates="false"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699185 4171 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699194 4171 flags.go:64] FLAG: --runonce="false"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699204 4171 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699216 4171 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699229 4171 flags.go:64] FLAG: --seccomp-default="false"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699241 4171 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699252 4171 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699265 4171 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699278 4171 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699291 4171 flags.go:64] FLAG: --storage-driver-password="root"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699303 4171 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699316 4171 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 23 14:16:30.702523 master-0 kubenswrapper[4171]: I0223 14:16:30.699328 4171 flags.go:64] FLAG: --storage-driver-user="root"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699341 4171 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699354 4171 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699366 4171 flags.go:64] FLAG: --system-cgroups=""
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699378 4171 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699396 4171 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699408 4171 flags.go:64] FLAG: --tls-cert-file=""
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699421 4171 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699439 4171 flags.go:64] FLAG: --tls-min-version=""
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699451 4171 flags.go:64] FLAG: --tls-private-key-file=""
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699463 4171 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699510 4171 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699526 4171 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699539 4171 flags.go:64] FLAG: --v="2"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699556 4171 flags.go:64] FLAG: --version="false"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699572 4171 flags.go:64] FLAG: --vmodule=""
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699586 4171 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: I0223 14:16:30.699601 4171 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: W0223 14:16:30.699899 4171 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: W0223 14:16:30.699918 4171 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: W0223 14:16:30.699933 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: W0223 14:16:30.699944 4171 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: W0223 14:16:30.699957 4171 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:16:30.703217 master-0 kubenswrapper[4171]: W0223 14:16:30.699971 4171 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.699985 4171 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.699996 4171 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700007 4171 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700019 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700029 4171 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700041 4171 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700051 4171 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700062 4171 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700072 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700082 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700117 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700129 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700139 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700149 4171 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700159 4171 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700173 4171 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700187 4171 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700199 4171 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:16:30.703886 master-0 kubenswrapper[4171]: W0223 14:16:30.700216 4171 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700229 4171 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700244 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700257 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700270 4171 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700283 4171 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700296 4171 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700307 4171 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700320 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700331 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700342 4171 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700354 4171 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700365 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700375 4171 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700386 4171 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700396 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700408 4171 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700419 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700429 4171 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700441 4171 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:16:30.704703 master-0 kubenswrapper[4171]: W0223 14:16:30.700451 4171 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700462 4171 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700511 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700524 4171 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700535 4171 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700547 4171 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700558 4171 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700570 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700580 4171 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700591 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700601 4171 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700612 4171 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700631 4171 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700646 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700658 4171 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700670 4171 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700681 4171 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700693 4171 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700707 4171 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700719 4171 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:16:30.705338 master-0 kubenswrapper[4171]: W0223 14:16:30.700731 4171 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: W0223 14:16:30.700741 4171 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: W0223 14:16:30.700752 4171 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: W0223 14:16:30.700763 4171 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: W0223 14:16:30.700774 4171 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: W0223 14:16:30.700785 4171 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: W0223 14:16:30.700796 4171 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: W0223 14:16:30.700806 4171 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:16:30.706102 master-0 kubenswrapper[4171]: I0223 14:16:30.700839 4171 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 14:16:30.715046 master-0 kubenswrapper[4171]: I0223 14:16:30.714965 4171 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Feb 23 14:16:30.715046 master-0 kubenswrapper[4171]: I0223 14:16:30.715034 4171 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 23 14:16:30.715188 master-0 kubenswrapper[4171]: W0223 14:16:30.715164 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 23 14:16:30.715188 master-0 kubenswrapper[4171]: W0223 14:16:30.715184 4171 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:16:30.715188 master-0 kubenswrapper[4171]: W0223 14:16:30.715191 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715198 4171 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715206 4171 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715214 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715220 4171 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715226 4171 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715231 4171 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715237 4171 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715243 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715249 4171 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715257 4171 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715267 4171 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715274 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715280 4171 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715287 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715294 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715301 4171 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715308 4171 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:16:30.715283 master-0 kubenswrapper[4171]: W0223 14:16:30.715315 4171 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715322 4171 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715329 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715334 4171 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715342 4171 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715350 4171 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715357 4171 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715363 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715369 4171 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715377 4171 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715386 4171 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715391 4171 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715397 4171 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715405 4171 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715412 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715418 4171 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715424 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715429 4171 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715435 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:16:30.715818 master-0 kubenswrapper[4171]: W0223 14:16:30.715440 4171 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715445 4171 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715452 4171 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715457 4171 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715463 4171 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715469 4171 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715491 4171 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715497 4171 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715503 4171 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715510 4171 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715518 4171 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715525 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715532 4171 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715539 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715548 4171 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715559 4171 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715566 4171 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715574 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715581 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715587 4171 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:16:30.716394 master-0 kubenswrapper[4171]: W0223 14:16:30.715593 4171 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715599 4171 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715606 4171 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715611 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715617 4171 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715622 4171 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715628 4171 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715633 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715639 4171 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715645 4171 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715651 4171 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715656 4171 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715662 4171 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: I0223 14:16:30.715672 4171 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715895 4171 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:16:30.717094 master-0 kubenswrapper[4171]: W0223 14:16:30.715909 4171 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715915 4171 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715921 4171 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715927 4171 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715933 4171 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715939 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715945 4171 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715951 4171 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715957 4171 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715962 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715968 4171 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715975 4171 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715982 4171 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715990 4171 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.715997 4171 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.716004 4171 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.716012 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.716018 4171 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.716026 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.716033 4171 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:16:30.717548 master-0 kubenswrapper[4171]: W0223 14:16:30.716040 4171 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716046 4171 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716053 4171 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716060 4171 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716067 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716102 4171 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716109 4171 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716116 4171 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716123 4171 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716130 4171 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716137 4171 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716144 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716150 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716157 4171 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716164 4171 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716174 4171 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716185 4171 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716192 4171 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716200 4171 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716209 4171 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:16:30.718109 master-0 kubenswrapper[4171]: W0223 14:16:30.716216 4171 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716223 4171 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716231 4171 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716239 4171 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716246 4171 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716255 4171 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716264 4171 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716273 4171 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716282 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716289 4171 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716297 4171 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716305 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716312 4171 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716318 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716326 4171 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716334 4171 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716340 4171 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716347 4171 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716354 4171 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:16:30.718693 master-0 kubenswrapper[4171]: W0223 14:16:30.716363 4171 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716372 4171 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716379 4171 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716387 4171 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716395 4171 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716404 4171 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716413 4171 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716421 4171 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716431 4171 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716441 4171 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716448 4171 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: W0223 14:16:30.716456 4171 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: I0223 14:16:30.716468 4171 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 23 14:16:30.719284 master-0 kubenswrapper[4171]: I0223 14:16:30.716836 4171 server.go:940] "Client rotation is on, will bootstrap in background" Feb 23 14:16:30.719824 master-0 kubenswrapper[4171]: I0223 14:16:30.719781 4171 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 23 14:16:30.721753 master-0 kubenswrapper[4171]: I0223 14:16:30.721717 4171 server.go:997] "Starting client certificate rotation" Feb 23 14:16:30.721753 master-0 kubenswrapper[4171]: I0223 14:16:30.721752 4171 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 23 14:16:30.722127 master-0 kubenswrapper[4171]: I0223 14:16:30.722040 4171 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 14:16:30.761209 master-0 kubenswrapper[4171]: I0223 
14:16:30.761105 4171 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 14:16:30.764067 master-0 kubenswrapper[4171]: I0223 14:16:30.763950 4171 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 14:16:30.768017 master-0 kubenswrapper[4171]: E0223 14:16:30.767936 4171 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:30.783857 master-0 kubenswrapper[4171]: I0223 14:16:30.783805 4171 log.go:25] "Validated CRI v1 runtime API" Feb 23 14:16:30.793813 master-0 kubenswrapper[4171]: I0223 14:16:30.793766 4171 log.go:25] "Validated CRI v1 image API" Feb 23 14:16:30.795802 master-0 kubenswrapper[4171]: I0223 14:16:30.795744 4171 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 23 14:16:30.802728 master-0 kubenswrapper[4171]: I0223 14:16:30.802664 4171 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 be859281-f98a-48e6-a6b4-cc97afbc917c:/dev/vda3] Feb 23 14:16:30.802822 master-0 kubenswrapper[4171]: I0223 14:16:30.802717 4171 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Feb 23 14:16:30.836808 master-0 kubenswrapper[4171]: I0223 14:16:30.836157 4171 
manager.go:217] Machine: {Timestamp:2026-02-23 14:16:30.833951824 +0000 UTC m=+0.937353393 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2800000 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:b2003aa684e6437e87dd9193d3a162ac SystemUUID:b2003aa6-84e6-437e-87dd-9193d3a162ac BootID:f84e7e92-cd63-4a9e-83cc-11dcb3ddc406 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:88:49:0a Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:4d:95:0b Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:4a:ca:1a:d6:6e:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 
Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 23 14:16:30.836808 master-0 kubenswrapper[4171]: I0223 14:16:30.836717 4171 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 23 14:16:30.837138 master-0 kubenswrapper[4171]: I0223 14:16:30.836925 4171 manager.go:233] Version: {KernelVersion:5.14.0-427.109.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602022246-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 23 14:16:30.837544 master-0 kubenswrapper[4171]: I0223 14:16:30.837463 4171 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 23 14:16:30.837891 master-0 kubenswrapper[4171]: I0223 14:16:30.837821 4171 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 23 14:16:30.838221 master-0 kubenswrapper[4171]: I0223 14:16:30.837881 4171 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentag
e":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 23 14:16:30.838311 master-0 kubenswrapper[4171]: I0223 14:16:30.838293 4171 topology_manager.go:138] "Creating topology manager with none policy" Feb 23 14:16:30.838370 master-0 kubenswrapper[4171]: I0223 14:16:30.838321 4171 container_manager_linux.go:303] "Creating device plugin manager" Feb 23 14:16:30.840403 master-0 kubenswrapper[4171]: I0223 14:16:30.840353 4171 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 14:16:30.840520 master-0 kubenswrapper[4171]: I0223 14:16:30.840418 4171 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 14:16:30.840715 master-0 kubenswrapper[4171]: I0223 14:16:30.840633 4171 state_mem.go:36] "Initialized new in-memory state store" Feb 23 14:16:30.840775 master-0 kubenswrapper[4171]: I0223 14:16:30.840764 4171 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 23 14:16:30.846330 master-0 kubenswrapper[4171]: I0223 14:16:30.846275 4171 kubelet.go:418] "Attempting to sync node with API server" Feb 23 14:16:30.846330 master-0 kubenswrapper[4171]: I0223 14:16:30.846323 4171 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 23 14:16:30.846505 master-0 kubenswrapper[4171]: I0223 14:16:30.846363 4171 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 23 14:16:30.846505 master-0 kubenswrapper[4171]: I0223 14:16:30.846385 4171 kubelet.go:324] "Adding apiserver pod source" Feb 23 14:16:30.846505 master-0 
kubenswrapper[4171]: I0223 14:16:30.846421 4171 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 23 14:16:30.855989 master-0 kubenswrapper[4171]: I0223 14:16:30.855879 4171 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-6.rhaos4.18.git7ed6156.el9" apiVersion="v1" Feb 23 14:16:30.856199 master-0 kubenswrapper[4171]: W0223 14:16:30.856086 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:30.856280 master-0 kubenswrapper[4171]: E0223 14:16:30.856253 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:30.856351 master-0 kubenswrapper[4171]: W0223 14:16:30.856100 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:30.856351 master-0 kubenswrapper[4171]: E0223 14:16:30.856319 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:30.858058 master-0 kubenswrapper[4171]: I0223 14:16:30.858011 4171 
kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 23 14:16:30.858349 master-0 kubenswrapper[4171]: I0223 14:16:30.858310 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 23 14:16:30.858412 master-0 kubenswrapper[4171]: I0223 14:16:30.858352 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 23 14:16:30.858412 master-0 kubenswrapper[4171]: I0223 14:16:30.858370 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 23 14:16:30.858412 master-0 kubenswrapper[4171]: I0223 14:16:30.858385 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 23 14:16:30.858412 master-0 kubenswrapper[4171]: I0223 14:16:30.858401 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 23 14:16:30.858663 master-0 kubenswrapper[4171]: I0223 14:16:30.858418 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 23 14:16:30.858663 master-0 kubenswrapper[4171]: I0223 14:16:30.858435 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 23 14:16:30.858663 master-0 kubenswrapper[4171]: I0223 14:16:30.858449 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 23 14:16:30.858663 master-0 kubenswrapper[4171]: I0223 14:16:30.858466 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 23 14:16:30.858663 master-0 kubenswrapper[4171]: I0223 14:16:30.858514 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 23 14:16:30.858663 master-0 kubenswrapper[4171]: I0223 14:16:30.858536 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 23 14:16:30.859120 master-0 kubenswrapper[4171]: I0223 14:16:30.859067 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 23 14:16:30.859237 master-0 
kubenswrapper[4171]: I0223 14:16:30.859127 4171 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 23 14:16:30.859688 master-0 kubenswrapper[4171]: I0223 14:16:30.859652 4171 server.go:1280] "Started kubelet" Feb 23 14:16:30.860301 master-0 kubenswrapper[4171]: I0223 14:16:30.859776 4171 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 23 14:16:30.860301 master-0 kubenswrapper[4171]: I0223 14:16:30.859854 4171 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 23 14:16:30.860301 master-0 kubenswrapper[4171]: I0223 14:16:30.859984 4171 server_v1.go:47] "podresources" method="list" useActivePods=true Feb 23 14:16:30.860695 master-0 kubenswrapper[4171]: I0223 14:16:30.860678 4171 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 23 14:16:30.861340 master-0 systemd[1]: Started Kubernetes Kubelet. Feb 23 14:16:30.862224 master-0 kubenswrapper[4171]: I0223 14:16:30.862156 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:30.871506 master-0 kubenswrapper[4171]: I0223 14:16:30.871424 4171 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 23 14:16:30.871506 master-0 kubenswrapper[4171]: I0223 14:16:30.871492 4171 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 23 14:16:30.872162 master-0 kubenswrapper[4171]: E0223 14:16:30.872092 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:16:30.873070 master-0 kubenswrapper[4171]: I0223 14:16:30.872989 4171 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 23 14:16:30.873070 master-0 kubenswrapper[4171]: 
I0223 14:16:30.873058 4171 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Feb 23 14:16:30.873193 master-0 kubenswrapper[4171]: I0223 14:16:30.873087 4171 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 23 14:16:30.875079 master-0 kubenswrapper[4171]: I0223 14:16:30.874861 4171 reconstruct.go:97] "Volume reconstruction finished" Feb 23 14:16:30.875079 master-0 kubenswrapper[4171]: I0223 14:16:30.874889 4171 reconciler.go:26] "Reconciler: start to sync state" Feb 23 14:16:30.875289 master-0 kubenswrapper[4171]: I0223 14:16:30.875254 4171 factory.go:55] Registering systemd factory Feb 23 14:16:30.875340 master-0 kubenswrapper[4171]: I0223 14:16:30.875293 4171 factory.go:221] Registration of the systemd container factory successfully Feb 23 14:16:30.875620 master-0 kubenswrapper[4171]: E0223 14:16:30.875540 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Feb 23 14:16:30.877608 master-0 kubenswrapper[4171]: W0223 14:16:30.877183 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:30.877691 master-0 kubenswrapper[4171]: E0223 14:16:30.877651 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:30.879619 master-0 kubenswrapper[4171]: E0223 14:16:30.875587 4171 event.go:368] "Unable to 
write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.1896e5d16143a25f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.859616863 +0000 UTC m=+0.963018362,LastTimestamp:2026-02-23 14:16:30.859616863 +0000 UTC m=+0.963018362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:30.881686 master-0 kubenswrapper[4171]: I0223 14:16:30.881642 4171 server.go:449] "Adding debug handlers to kubelet server" Feb 23 14:16:30.881749 master-0 kubenswrapper[4171]: I0223 14:16:30.881695 4171 factory.go:153] Registering CRI-O factory Feb 23 14:16:30.881749 master-0 kubenswrapper[4171]: I0223 14:16:30.881721 4171 factory.go:221] Registration of the crio container factory successfully Feb 23 14:16:30.881826 master-0 kubenswrapper[4171]: I0223 14:16:30.881812 4171 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 23 14:16:30.881874 master-0 kubenswrapper[4171]: I0223 14:16:30.881849 4171 factory.go:103] Registering Raw factory Feb 23 14:16:30.881874 master-0 kubenswrapper[4171]: I0223 14:16:30.881873 4171 manager.go:1196] Started watching for new ooms in manager Feb 23 14:16:30.882884 master-0 kubenswrapper[4171]: I0223 14:16:30.882854 4171 manager.go:319] Starting recovery of all containers Feb 23 14:16:30.883543 master-0 kubenswrapper[4171]: E0223 14:16:30.883517 
4171 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Feb 23 14:16:30.906561 master-0 kubenswrapper[4171]: I0223 14:16:30.906196 4171 manager.go:324] Recovery completed Feb 23 14:16:30.927289 master-0 kubenswrapper[4171]: I0223 14:16:30.927236 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:30.928910 master-0 kubenswrapper[4171]: I0223 14:16:30.928862 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:30.928964 master-0 kubenswrapper[4171]: I0223 14:16:30.928919 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:30.928964 master-0 kubenswrapper[4171]: I0223 14:16:30.928931 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:30.930008 master-0 kubenswrapper[4171]: I0223 14:16:30.929968 4171 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 23 14:16:30.930082 master-0 kubenswrapper[4171]: I0223 14:16:30.930056 4171 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 23 14:16:30.930126 master-0 kubenswrapper[4171]: I0223 14:16:30.930106 4171 state_mem.go:36] "Initialized new in-memory state store" Feb 23 14:16:30.935171 master-0 kubenswrapper[4171]: I0223 14:16:30.935141 4171 policy_none.go:49] "None policy: Start" Feb 23 14:16:30.936207 master-0 kubenswrapper[4171]: I0223 14:16:30.936053 4171 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 23 14:16:30.936207 master-0 kubenswrapper[4171]: I0223 14:16:30.936096 4171 state_mem.go:35] "Initializing new in-memory state store" Feb 23 14:16:30.972355 master-0 kubenswrapper[4171]: E0223 14:16:30.972284 4171 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"master-0\" not found" Feb 23 14:16:31.022355 master-0 kubenswrapper[4171]: I0223 14:16:31.021749 4171 manager.go:334] "Starting Device Plugin manager" Feb 23 14:16:31.022355 master-0 kubenswrapper[4171]: I0223 14:16:31.021827 4171 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 23 14:16:31.022355 master-0 kubenswrapper[4171]: I0223 14:16:31.021850 4171 server.go:79] "Starting device plugin registration server" Feb 23 14:16:31.022355 master-0 kubenswrapper[4171]: I0223 14:16:31.022422 4171 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.022446 4171 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.022727 4171 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.022917 4171 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.022932 4171 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: E0223 14:16:31.025343 4171 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.053045 4171 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.056280 4171 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6"
Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.056361 4171 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: I0223 14:16:31.056392 4171 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: E0223 14:16:31.056513 4171 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: W0223 14:16:31.059319 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Feb 23 14:16:31.064468 master-0 kubenswrapper[4171]: E0223 14:16:31.059440 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Feb 23 14:16:31.076774 master-0 kubenswrapper[4171]: E0223 14:16:31.076686 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Feb 23 14:16:31.122971 master-0 kubenswrapper[4171]: I0223 14:16:31.122884 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.124170 master-0 kubenswrapper[4171]: I0223 14:16:31.124117 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.124170 master-0 kubenswrapper[4171]: I0223 14:16:31.124172 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.124338 master-0 kubenswrapper[4171]: I0223 14:16:31.124189 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.124338 master-0 kubenswrapper[4171]: I0223 14:16:31.124233 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Feb 23 14:16:31.125384 master-0 kubenswrapper[4171]: E0223 14:16:31.125306 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Feb 23 14:16:31.157499 master-0 kubenswrapper[4171]: I0223 14:16:31.157425 4171 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0"]
Feb 23 14:16:31.157839 master-0 kubenswrapper[4171]: I0223 14:16:31.157566 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.159356 master-0 kubenswrapper[4171]: I0223 14:16:31.159202 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.159356 master-0 kubenswrapper[4171]: I0223 14:16:31.159359 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.159649 master-0 kubenswrapper[4171]: I0223 14:16:31.159382 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.159649 master-0 kubenswrapper[4171]: I0223 14:16:31.159570 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.159943 master-0 kubenswrapper[4171]: I0223 14:16:31.159881 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.160018 master-0 kubenswrapper[4171]: I0223 14:16:31.159966 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.160743 master-0 kubenswrapper[4171]: I0223 14:16:31.160679 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.160869 master-0 kubenswrapper[4171]: I0223 14:16:31.160730 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.160869 master-0 kubenswrapper[4171]: I0223 14:16:31.160798 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.161112 master-0 kubenswrapper[4171]: I0223 14:16:31.161069 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.161274 master-0 kubenswrapper[4171]: I0223 14:16:31.161125 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.161274 master-0 kubenswrapper[4171]: I0223 14:16:31.161158 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.161274 master-0 kubenswrapper[4171]: I0223 14:16:31.161178 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.161274 master-0 kubenswrapper[4171]: I0223 14:16:31.161261 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.161517 master-0 kubenswrapper[4171]: I0223 14:16:31.161304 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.162287 master-0 kubenswrapper[4171]: I0223 14:16:31.162238 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.162287 master-0 kubenswrapper[4171]: I0223 14:16:31.162286 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.162516 master-0 kubenswrapper[4171]: I0223 14:16:31.162308 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.162516 master-0 kubenswrapper[4171]: I0223 14:16:31.162464 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.163169 master-0 kubenswrapper[4171]: I0223 14:16:31.162804 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.163169 master-0 kubenswrapper[4171]: I0223 14:16:31.162856 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.163169 master-0 kubenswrapper[4171]: I0223 14:16:31.162888 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.163169 master-0 kubenswrapper[4171]: I0223 14:16:31.162996 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.163169 master-0 kubenswrapper[4171]: I0223 14:16:31.163124 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.163541 master-0 kubenswrapper[4171]: I0223 14:16:31.163461 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.163541 master-0 kubenswrapper[4171]: I0223 14:16:31.163533 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.163712 master-0 kubenswrapper[4171]: I0223 14:16:31.163556 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.163779 master-0 kubenswrapper[4171]: I0223 14:16:31.163737 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.164391 master-0 kubenswrapper[4171]: I0223 14:16:31.163880 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.164391 master-0 kubenswrapper[4171]: I0223 14:16:31.163925 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.164391 master-0 kubenswrapper[4171]: I0223 14:16:31.164083 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.164391 master-0 kubenswrapper[4171]: I0223 14:16:31.164123 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.164391 master-0 kubenswrapper[4171]: I0223 14:16:31.164147 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.165156 master-0 kubenswrapper[4171]: I0223 14:16:31.164653 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.165156 master-0 kubenswrapper[4171]: I0223 14:16:31.164737 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.165156 master-0 kubenswrapper[4171]: I0223 14:16:31.164755 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.165156 master-0 kubenswrapper[4171]: I0223 14:16:31.164883 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.165156 master-0 kubenswrapper[4171]: I0223 14:16:31.164918 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.165156 master-0 kubenswrapper[4171]: I0223 14:16:31.164934 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.165621 master-0 kubenswrapper[4171]: I0223 14:16:31.165192 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.165621 master-0 kubenswrapper[4171]: I0223 14:16:31.165233 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.166286 master-0 kubenswrapper[4171]: I0223 14:16:31.166240 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.166440 master-0 kubenswrapper[4171]: I0223 14:16:31.166294 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.166440 master-0 kubenswrapper[4171]: I0223 14:16:31.166318 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.176539 master-0 kubenswrapper[4171]: I0223 14:16:31.176468 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.176658 master-0 kubenswrapper[4171]: I0223 14:16:31.176570 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.176658 master-0 kubenswrapper[4171]: I0223 14:16:31.176626 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.176849 master-0 kubenswrapper[4171]: I0223 14:16:31.176683 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.176849 master-0 kubenswrapper[4171]: I0223 14:16:31.176720 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.176849 master-0 kubenswrapper[4171]: I0223 14:16:31.176748 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.176849 master-0 kubenswrapper[4171]: I0223 14:16:31.176777 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.177080 master-0 kubenswrapper[4171]: I0223 14:16:31.176891 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.177080 master-0 kubenswrapper[4171]: I0223 14:16:31.176986 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.177080 master-0 kubenswrapper[4171]: I0223 14:16:31.177052 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.177242 master-0 kubenswrapper[4171]: I0223 14:16:31.177090 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.177242 master-0 kubenswrapper[4171]: I0223 14:16:31.177140 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.177242 master-0 kubenswrapper[4171]: I0223 14:16:31.177175 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.177411 master-0 kubenswrapper[4171]: I0223 14:16:31.177250 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.177411 master-0 kubenswrapper[4171]: I0223 14:16:31.177342 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.177411 master-0 kubenswrapper[4171]: I0223 14:16:31.177389 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.177637 master-0 kubenswrapper[4171]: I0223 14:16:31.177431 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.278632 master-0 kubenswrapper[4171]: I0223 14:16:31.278594 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.278801 master-0 kubenswrapper[4171]: I0223 14:16:31.278647 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.278801 master-0 kubenswrapper[4171]: I0223 14:16:31.278787 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.278933 master-0 kubenswrapper[4171]: I0223 14:16:31.278842 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.278933 master-0 kubenswrapper[4171]: I0223 14:16:31.278882 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.278933 master-0 kubenswrapper[4171]: I0223 14:16:31.278913 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.278933 master-0 kubenswrapper[4171]: I0223 14:16:31.278926 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.278945 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.278879 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.278975 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.279024 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.279027 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.279069 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.279073 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.279118 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.279132 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.279151 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279166 master-0 kubenswrapper[4171]: I0223 14:16:31.278917 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279209 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279231 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279271 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279290 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279357 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279454 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279529 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279603 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279640 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279670 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279702 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279707 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279668 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279748 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279775 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.279847 master-0 kubenswrapper[4171]: I0223 14:16:31.279780 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.325869 master-0 kubenswrapper[4171]: I0223 14:16:31.325780 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.327317 master-0 kubenswrapper[4171]: I0223 14:16:31.327268 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.327578 master-0 kubenswrapper[4171]: I0223 14:16:31.327334 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.327578 master-0 kubenswrapper[4171]: I0223 14:16:31.327363 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.327578 master-0 kubenswrapper[4171]: I0223 14:16:31.327432 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Feb 23 14:16:31.328634 master-0 kubenswrapper[4171]: E0223 14:16:31.328539 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Feb 23 14:16:31.479167 master-0 kubenswrapper[4171]: E0223 14:16:31.478924 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Feb 23 14:16:31.507889 master-0 kubenswrapper[4171]: I0223 14:16:31.507800 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:16:31.520539 master-0 kubenswrapper[4171]: I0223 14:16:31.520470 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:16:31.541438 master-0 kubenswrapper[4171]: I0223 14:16:31.541372 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:16:31.562680 master-0 kubenswrapper[4171]: I0223 14:16:31.562607 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:16:31.574282 master-0 kubenswrapper[4171]: I0223 14:16:31.574222 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:16:31.729972 master-0 kubenswrapper[4171]: I0223 14:16:31.729761 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 14:16:31.731709 master-0 kubenswrapper[4171]: I0223 14:16:31.731651 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Feb 23 14:16:31.731825 master-0 kubenswrapper[4171]: I0223 14:16:31.731723 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Feb 23 14:16:31.731825 master-0 kubenswrapper[4171]: I0223 14:16:31.731742 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Feb 23 14:16:31.731825 master-0 kubenswrapper[4171]: I0223 14:16:31.731821 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Feb 23 14:16:31.733049 master-0 kubenswrapper[4171]: E0223 14:16:31.732973 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Feb 23 14:16:31.864158 master-0 kubenswrapper[4171]: I0223 14:16:31.864041 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Feb 23 14:16:32.082543 master-0 kubenswrapper[4171]: W0223 14:16:32.082212 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Feb 23 14:16:32.082543 master-0 kubenswrapper[4171]: E0223 14:16:32.082313 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Feb 23 14:16:32.201725 master-0 kubenswrapper[4171]: W0223 14:16:32.201530 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Feb 23 14:16:32.201725 master-0 kubenswrapper[4171]: E0223 14:16:32.201633 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Feb 23 14:16:32.246067 master-0 kubenswrapper[4171]: W0223 14:16:32.245924 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Feb 23 14:16:32.246067 master-0 kubenswrapper[4171]: E0223 14:16:32.246031 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Feb 23 14:16:32.281007 master-0 kubenswrapper[4171]: E0223 14:16:32.280884 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Feb 23 14:16:32.305469 master-0 kubenswrapper[4171]: W0223 14:16:32.305339 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Feb 23 14:16:32.305469 master-0 kubenswrapper[4171]: E0223 14:16:32.305431 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Feb 23 14:16:32.464874 master-0 kubenswrapper[4171]: I0223 14:16:32.464822 4171 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 14:16:32.533651 master-0 kubenswrapper[4171]: I0223 14:16:32.533559 4171
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:32.535056 master-0 kubenswrapper[4171]: I0223 14:16:32.535011 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:32.535160 master-0 kubenswrapper[4171]: I0223 14:16:32.535068 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:32.535160 master-0 kubenswrapper[4171]: I0223 14:16:32.535087 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:32.535160 master-0 kubenswrapper[4171]: I0223 14:16:32.535142 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:16:32.536237 master-0 kubenswrapper[4171]: E0223 14:16:32.536163 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Feb 23 14:16:32.548267 master-0 kubenswrapper[4171]: W0223 14:16:32.548188 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12dab5d350ebc129b0bfa4714d330b15.slice/crio-dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0 WatchSource:0}: Error finding container dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0: Status 404 returned error can't find the container with id dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0 Feb 23 14:16:32.625932 master-0 kubenswrapper[4171]: W0223 14:16:32.625854 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc997c8e9d3be51d454d8e61e376bef08.slice/crio-57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117 WatchSource:0}: Error 
finding container 57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117: Status 404 returned error can't find the container with id 57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117 Feb 23 14:16:32.710896 master-0 kubenswrapper[4171]: W0223 14:16:32.710794 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9ad9373c007a4fcd25e70622bdc8deb.slice/crio-64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d WatchSource:0}: Error finding container 64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d: Status 404 returned error can't find the container with id 64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d Feb 23 14:16:32.812086 master-0 kubenswrapper[4171]: I0223 14:16:32.811991 4171 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 14:16:32.813736 master-0 kubenswrapper[4171]: E0223 14:16:32.813683 4171 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:32.864667 master-0 kubenswrapper[4171]: I0223 14:16:32.864371 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:32.971543 master-0 kubenswrapper[4171]: W0223 14:16:32.971431 4171 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687e92a6cecf1e2beeef16a0b322ad08.slice/crio-0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2 WatchSource:0}: Error finding container 0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2: Status 404 returned error can't find the container with id 0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2 Feb 23 14:16:33.063655 master-0 kubenswrapper[4171]: I0223 14:16:33.063400 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"56c3cb71c9851003c8de7e7c5db4b87e","Type":"ContainerStarted","Data":"6b4c5917f42a018e656736ffe0ec509b45d342d70ccb1039a1f41866022cf32e"} Feb 23 14:16:33.065175 master-0 kubenswrapper[4171]: I0223 14:16:33.065114 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerStarted","Data":"0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2"} Feb 23 14:16:33.067327 master-0 kubenswrapper[4171]: I0223 14:16:33.067248 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d"} Feb 23 14:16:33.069019 master-0 kubenswrapper[4171]: I0223 14:16:33.068957 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerStarted","Data":"57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117"} Feb 23 14:16:33.070894 master-0 kubenswrapper[4171]: I0223 14:16:33.070847 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" 
event={"ID":"12dab5d350ebc129b0bfa4714d330b15","Type":"ContainerStarted","Data":"dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0"} Feb 23 14:16:33.863432 master-0 kubenswrapper[4171]: I0223 14:16:33.863086 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:33.882175 master-0 kubenswrapper[4171]: E0223 14:16:33.882120 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Feb 23 14:16:33.949179 master-0 kubenswrapper[4171]: W0223 14:16:33.949124 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:33.949284 master-0 kubenswrapper[4171]: E0223 14:16:33.949178 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:33.973351 master-0 kubenswrapper[4171]: W0223 14:16:33.973286 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:33.973351 master-0 kubenswrapper[4171]: E0223 
14:16:33.973342 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:34.004942 master-0 kubenswrapper[4171]: E0223 14:16:34.004757 4171 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.1896e5d16143a25f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.859616863 +0000 UTC m=+0.963018362,LastTimestamp:2026-02-23 14:16:30.859616863 +0000 UTC m=+0.963018362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:34.136992 master-0 kubenswrapper[4171]: I0223 14:16:34.136871 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:34.137794 master-0 kubenswrapper[4171]: I0223 14:16:34.137732 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:34.137794 master-0 kubenswrapper[4171]: I0223 14:16:34.137787 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:34.137794 master-0 kubenswrapper[4171]: I0223 14:16:34.137799 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientPID" Feb 23 14:16:34.138056 master-0 kubenswrapper[4171]: I0223 14:16:34.137872 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:16:34.138607 master-0 kubenswrapper[4171]: E0223 14:16:34.138562 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Feb 23 14:16:34.664230 master-0 kubenswrapper[4171]: W0223 14:16:34.664183 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:34.664437 master-0 kubenswrapper[4171]: E0223 14:16:34.664243 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:34.864170 master-0 kubenswrapper[4171]: I0223 14:16:34.864109 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:35.077034 master-0 kubenswrapper[4171]: I0223 14:16:35.076787 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"12dab5d350ebc129b0bfa4714d330b15","Type":"ContainerStarted","Data":"5809ecf60a8e4db68dfab073298af03c567dcc4e91a5b6d7f6d78ca758010d15"} Feb 23 14:16:35.079358 master-0 kubenswrapper[4171]: I0223 14:16:35.078772 4171 
generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="02553ea2f34fd5b9d9104437dd7120800883c473073a6a74895604093906e009" exitCode=0 Feb 23 14:16:35.079358 master-0 kubenswrapper[4171]: I0223 14:16:35.078826 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"02553ea2f34fd5b9d9104437dd7120800883c473073a6a74895604093906e009"} Feb 23 14:16:35.079358 master-0 kubenswrapper[4171]: I0223 14:16:35.078887 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:35.079978 master-0 kubenswrapper[4171]: I0223 14:16:35.079936 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:35.079978 master-0 kubenswrapper[4171]: I0223 14:16:35.079977 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:35.080073 master-0 kubenswrapper[4171]: I0223 14:16:35.079987 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:35.137871 master-0 kubenswrapper[4171]: W0223 14:16:35.137828 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:35.137987 master-0 kubenswrapper[4171]: E0223 14:16:35.137881 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: 
connection refused" logger="UnhandledError" Feb 23 14:16:35.864002 master-0 kubenswrapper[4171]: I0223 14:16:35.863955 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:36.082634 master-0 kubenswrapper[4171]: I0223 14:16:36.082585 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/0.log" Feb 23 14:16:36.083186 master-0 kubenswrapper[4171]: I0223 14:16:36.083098 4171 generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="4b406f1bac6a5013c8196bd53ff19cd46bcbed85b0589d351d7e1f24937b3d67" exitCode=1 Feb 23 14:16:36.083186 master-0 kubenswrapper[4171]: I0223 14:16:36.083174 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:36.083250 master-0 kubenswrapper[4171]: I0223 14:16:36.083180 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"4b406f1bac6a5013c8196bd53ff19cd46bcbed85b0589d351d7e1f24937b3d67"} Feb 23 14:16:36.084221 master-0 kubenswrapper[4171]: I0223 14:16:36.083730 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:36.084221 master-0 kubenswrapper[4171]: I0223 14:16:36.083753 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:36.084221 master-0 kubenswrapper[4171]: I0223 14:16:36.083762 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 
14:16:36.084221 master-0 kubenswrapper[4171]: I0223 14:16:36.084009 4171 scope.go:117] "RemoveContainer" containerID="4b406f1bac6a5013c8196bd53ff19cd46bcbed85b0589d351d7e1f24937b3d67" Feb 23 14:16:36.086505 master-0 kubenswrapper[4171]: I0223 14:16:36.084965 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"12dab5d350ebc129b0bfa4714d330b15","Type":"ContainerStarted","Data":"86dd361ededa7f9d61d9c2bea900261b661a76c0468603804e9af20765f8d8cd"} Feb 23 14:16:36.086505 master-0 kubenswrapper[4171]: I0223 14:16:36.085068 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:36.087242 master-0 kubenswrapper[4171]: I0223 14:16:36.087216 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:36.087242 master-0 kubenswrapper[4171]: I0223 14:16:36.087242 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:36.087311 master-0 kubenswrapper[4171]: I0223 14:16:36.087251 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:36.864334 master-0 kubenswrapper[4171]: I0223 14:16:36.864251 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:36.973903 master-0 kubenswrapper[4171]: I0223 14:16:36.973832 4171 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 14:16:36.975497 master-0 kubenswrapper[4171]: E0223 14:16:36.975445 4171 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: 
cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:37.083779 master-0 kubenswrapper[4171]: E0223 14:16:37.083708 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Feb 23 14:16:37.087238 master-0 kubenswrapper[4171]: I0223 14:16:37.087208 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:37.088087 master-0 kubenswrapper[4171]: I0223 14:16:37.088067 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:37.088145 master-0 kubenswrapper[4171]: I0223 14:16:37.088101 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:37.088145 master-0 kubenswrapper[4171]: I0223 14:16:37.088111 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:37.338718 master-0 kubenswrapper[4171]: I0223 14:16:37.338666 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:37.339928 master-0 kubenswrapper[4171]: I0223 14:16:37.339834 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:37.339928 master-0 kubenswrapper[4171]: I0223 14:16:37.339874 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:37.339928 master-0 kubenswrapper[4171]: I0223 14:16:37.339882 4171 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:37.339928 master-0 kubenswrapper[4171]: I0223 14:16:37.339918 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:16:37.340880 master-0 kubenswrapper[4171]: E0223 14:16:37.340814 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Feb 23 14:16:37.863826 master-0 kubenswrapper[4171]: I0223 14:16:37.863767 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:38.206396 master-0 kubenswrapper[4171]: W0223 14:16:38.206215 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:38.206396 master-0 kubenswrapper[4171]: E0223 14:16:38.206373 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:38.863826 master-0 kubenswrapper[4171]: I0223 14:16:38.863769 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:39.092664 master-0 
kubenswrapper[4171]: I0223 14:16:39.091892 4171 generic.go:334] "Generic (PLEG): container finished" podID="687e92a6cecf1e2beeef16a0b322ad08" containerID="40ca3552a0c110bf631be979ddbff1eb4abba63ee7c1c34c419314566066d566" exitCode=0 Feb 23 14:16:39.092664 master-0 kubenswrapper[4171]: I0223 14:16:39.092178 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerDied","Data":"40ca3552a0c110bf631be979ddbff1eb4abba63ee7c1c34c419314566066d566"} Feb 23 14:16:39.093847 master-0 kubenswrapper[4171]: I0223 14:16:39.092473 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:39.094353 master-0 kubenswrapper[4171]: I0223 14:16:39.094150 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:39.094353 master-0 kubenswrapper[4171]: I0223 14:16:39.094187 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:39.094353 master-0 kubenswrapper[4171]: I0223 14:16:39.094200 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:39.094353 master-0 kubenswrapper[4171]: I0223 14:16:39.094236 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"ad9ef13f95d7901e7f24b0914da444cc2df5f3bc77853f6da272e6cf3ddf8974"} Feb 23 14:16:39.096840 master-0 kubenswrapper[4171]: I0223 14:16:39.096785 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:39.098052 master-0 kubenswrapper[4171]: I0223 14:16:39.097943 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Feb 23 14:16:39.098052 master-0 kubenswrapper[4171]: I0223 14:16:39.097981 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:39.098052 master-0 kubenswrapper[4171]: I0223 14:16:39.097997 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:39.098447 master-0 kubenswrapper[4171]: I0223 14:16:39.098377 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/1.log" Feb 23 14:16:39.099077 master-0 kubenswrapper[4171]: I0223 14:16:39.098959 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/0.log" Feb 23 14:16:39.100567 master-0 kubenswrapper[4171]: I0223 14:16:39.099551 4171 generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="e6d5d550060f9bd110cd9ce89235c3a55d49cefa1174199bfdb6dbdcec650ff6" exitCode=1 Feb 23 14:16:39.100567 master-0 kubenswrapper[4171]: I0223 14:16:39.099648 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"e6d5d550060f9bd110cd9ce89235c3a55d49cefa1174199bfdb6dbdcec650ff6"} Feb 23 14:16:39.100567 master-0 kubenswrapper[4171]: I0223 14:16:39.099721 4171 scope.go:117] "RemoveContainer" containerID="4b406f1bac6a5013c8196bd53ff19cd46bcbed85b0589d351d7e1f24937b3d67" Feb 23 14:16:39.100567 master-0 kubenswrapper[4171]: I0223 14:16:39.099924 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:39.102663 master-0 kubenswrapper[4171]: I0223 14:16:39.101439 4171 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:39.102663 master-0 kubenswrapper[4171]: I0223 14:16:39.101473 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:39.102663 master-0 kubenswrapper[4171]: I0223 14:16:39.101529 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:39.102663 master-0 kubenswrapper[4171]: I0223 14:16:39.101996 4171 scope.go:117] "RemoveContainer" containerID="e6d5d550060f9bd110cd9ce89235c3a55d49cefa1174199bfdb6dbdcec650ff6" Feb 23 14:16:39.102663 master-0 kubenswrapper[4171]: E0223 14:16:39.102265 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="c997c8e9d3be51d454d8e61e376bef08" Feb 23 14:16:39.105149 master-0 kubenswrapper[4171]: I0223 14:16:39.105095 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"56c3cb71c9851003c8de7e7c5db4b87e","Type":"ContainerStarted","Data":"1161af5c0919fc04c557fffb0fa1799b448226d91a3bed741eb027099a2bf8f9"} Feb 23 14:16:39.105552 master-0 kubenswrapper[4171]: I0223 14:16:39.105450 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:39.106823 master-0 kubenswrapper[4171]: I0223 14:16:39.106756 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:39.106823 master-0 kubenswrapper[4171]: I0223 14:16:39.106778 4171 kubelet_node_status.go:724] "Recording event 
message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:39.106823 master-0 kubenswrapper[4171]: I0223 14:16:39.106789 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:39.563157 master-0 kubenswrapper[4171]: W0223 14:16:39.563051 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Feb 23 14:16:39.564090 master-0 kubenswrapper[4171]: E0223 14:16:39.563224 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Feb 23 14:16:40.139320 master-0 kubenswrapper[4171]: I0223 14:16:40.139119 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerStarted","Data":"bdd3290dcf6f732f006b381bec2edfc3a7a58623787040a36811efd529225351"} Feb 23 14:16:40.141147 master-0 kubenswrapper[4171]: I0223 14:16:40.141100 4171 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="ad9ef13f95d7901e7f24b0914da444cc2df5f3bc77853f6da272e6cf3ddf8974" exitCode=1 Feb 23 14:16:40.141290 master-0 kubenswrapper[4171]: I0223 14:16:40.141214 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"ad9ef13f95d7901e7f24b0914da444cc2df5f3bc77853f6da272e6cf3ddf8974"} Feb 23 14:16:40.143073 master-0 
kubenswrapper[4171]: I0223 14:16:40.142986 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/1.log" Feb 23 14:16:40.143922 master-0 kubenswrapper[4171]: I0223 14:16:40.143876 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:40.144469 master-0 kubenswrapper[4171]: I0223 14:16:40.144424 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:40.145285 master-0 kubenswrapper[4171]: I0223 14:16:40.145244 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:40.145285 master-0 kubenswrapper[4171]: I0223 14:16:40.145283 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:40.145413 master-0 kubenswrapper[4171]: I0223 14:16:40.145296 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:40.145671 master-0 kubenswrapper[4171]: I0223 14:16:40.145635 4171 scope.go:117] "RemoveContainer" containerID="e6d5d550060f9bd110cd9ce89235c3a55d49cefa1174199bfdb6dbdcec650ff6" Feb 23 14:16:40.145838 master-0 kubenswrapper[4171]: E0223 14:16:40.145798 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="c997c8e9d3be51d454d8e61e376bef08" Feb 23 14:16:40.145885 master-0 kubenswrapper[4171]: I0223 14:16:40.145873 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientMemory" Feb 23 14:16:40.145928 master-0 kubenswrapper[4171]: I0223 14:16:40.145889 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:40.145928 master-0 kubenswrapper[4171]: I0223 14:16:40.145902 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:41.026320 master-0 kubenswrapper[4171]: E0223 14:16:41.025486 4171 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Feb 23 14:16:41.195708 master-0 kubenswrapper[4171]: W0223 14:16:41.195639 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:41.195885 master-0 kubenswrapper[4171]: E0223 14:16:41.195711 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 23 14:16:41.195885 master-0 kubenswrapper[4171]: W0223 14:16:41.195874 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 23 14:16:41.195958 master-0 kubenswrapper[4171]: E0223 14:16:41.195898 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
Feb 23 14:16:41.199458 master-0 kubenswrapper[4171]: I0223 14:16:41.198530 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:41.868574 master-0 kubenswrapper[4171]: I0223 14:16:41.868517 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:42.868938 master-0 kubenswrapper[4171]: I0223 14:16:42.868553 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:43.152420 master-0 kubenswrapper[4171]: I0223 14:16:43.151713 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerStarted","Data":"e3d987d25306f70a7327b5bce6ea549b476972db2d3366cf37d35b30c1531578"} Feb 23 14:16:43.152420 master-0 kubenswrapper[4171]: I0223 14:16:43.151867 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:43.153793 master-0 kubenswrapper[4171]: I0223 14:16:43.153287 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:43.153793 master-0 kubenswrapper[4171]: I0223 14:16:43.153310 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:43.153793 master-0 kubenswrapper[4171]: I0223 14:16:43.153319 4171 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:43.156987 master-0 kubenswrapper[4171]: I0223 14:16:43.156634 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"f38113657e6647d113d4b8b771a4b871cb4df714ffeae8172aebba272b7e4da9"} Feb 23 14:16:43.156987 master-0 kubenswrapper[4171]: I0223 14:16:43.156698 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:43.157868 master-0 kubenswrapper[4171]: I0223 14:16:43.157396 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:43.157868 master-0 kubenswrapper[4171]: I0223 14:16:43.157412 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:43.157868 master-0 kubenswrapper[4171]: I0223 14:16:43.157423 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:43.157868 master-0 kubenswrapper[4171]: I0223 14:16:43.157635 4171 scope.go:117] "RemoveContainer" containerID="ad9ef13f95d7901e7f24b0914da444cc2df5f3bc77853f6da272e6cf3ddf8974" Feb 23 14:16:43.492658 master-0 kubenswrapper[4171]: E0223 14:16:43.492572 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 23 14:16:43.741182 master-0 kubenswrapper[4171]: I0223 14:16:43.741071 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:43.742903 master-0 kubenswrapper[4171]: I0223 14:16:43.742774 4171 kubelet_node_status.go:724] "Recording event message for node" 
node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:43.742903 master-0 kubenswrapper[4171]: I0223 14:16:43.742836 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:43.742903 master-0 kubenswrapper[4171]: I0223 14:16:43.742854 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:43.743119 master-0 kubenswrapper[4171]: I0223 14:16:43.742918 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:16:43.750954 master-0 kubenswrapper[4171]: E0223 14:16:43.750888 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Feb 23 14:16:43.869683 master-0 kubenswrapper[4171]: I0223 14:16:43.869630 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:44.013937 master-0 kubenswrapper[4171]: E0223 14:16:44.013758 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16143a25f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.859616863 +0000 UTC m=+0.963018362,LastTimestamp:2026-02-23 14:16:30.859616863 +0000 UTC m=+0.963018362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.021420 master-0 kubenswrapper[4171]: E0223 14:16:44.021274 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.028039 master-0 kubenswrapper[4171]: E0223 14:16:44.027921 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.034416 master-0 
kubenswrapper[4171]: E0223 14:16:44.034302 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d165655fa3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,LastTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.041286 master-0 kubenswrapper[4171]: E0223 14:16:44.041184 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16c495859 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:31.044540505 +0000 UTC m=+1.147942034,LastTimestamp:2026-02-23 14:16:31.044540505 +0000 UTC m=+1.147942034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.097119 master-0 kubenswrapper[4171]: E0223 14:16:44.096899 4171 event.go:359] "Server rejected event (will not retry!)" err="events 
\"master-0.1896e5d16564e30a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:31.124152758 +0000 UTC m=+1.227554287,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.102319 master-0 kubenswrapper[4171]: E0223 14:16:44.102201 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d1656538f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:31.124183456 +0000 UTC m=+1.227584975,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.108666 master-0 kubenswrapper[4171]: E0223 14:16:44.108546 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d165655fa3\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d165655fa3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,LastTimestamp:2026-02-23 14:16:31.124200429 +0000 UTC m=+1.227601958,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.121968 master-0 kubenswrapper[4171]: E0223 14:16:44.121840 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d16564e30a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:31.15932364 +0000 UTC m=+1.262725169,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.127745 master-0 kubenswrapper[4171]: E0223 14:16:44.127633 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d1656538f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:31.159372781 +0000 UTC m=+1.262774310,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.132266 master-0 kubenswrapper[4171]: E0223 14:16:44.132166 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d165655fa3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d165655fa3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,LastTimestamp:2026-02-23 14:16:31.159394262 +0000 UTC m=+1.262795791,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.136838 master-0 kubenswrapper[4171]: E0223 14:16:44.136729 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d16564e30a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:31.160711756 +0000 UTC m=+1.264113285,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.141704 master-0 kubenswrapper[4171]: E0223 14:16:44.141603 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d1656538f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:31.160780958 +0000 UTC m=+1.264182517,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.148080 master-0 kubenswrapper[4171]: E0223 14:16:44.147984 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d165655fa3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d165655fa3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,LastTimestamp:2026-02-23 14:16:31.160811236 +0000 UTC m=+1.264212765,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.153937 master-0 kubenswrapper[4171]: E0223 14:16:44.153864 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d16564e30a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:31.161147952 +0000 UTC m=+1.264549481,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.161051 master-0 kubenswrapper[4171]: E0223 14:16:44.160915 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d1656538f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:31.161169713 +0000 UTC m=+1.264571232,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.164121 master-0 kubenswrapper[4171]: I0223 14:16:44.164062 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:44.170004 master-0 kubenswrapper[4171]: I0223 14:16:44.169939 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:44.170164 master-0 kubenswrapper[4171]: I0223 14:16:44.170125 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:44.171161 master-0 kubenswrapper[4171]: I0223 14:16:44.170639 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46"} Feb 23 14:16:44.171330 master-0 kubenswrapper[4171]: I0223 14:16:44.171299 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:44.171372 master-0 kubenswrapper[4171]: I0223 14:16:44.171343 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:44.171372 master-0 kubenswrapper[4171]: I0223 14:16:44.171361 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientPID" Feb 23 14:16:44.171912 master-0 kubenswrapper[4171]: I0223 14:16:44.171881 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:44.171963 master-0 kubenswrapper[4171]: I0223 14:16:44.171918 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:44.171963 master-0 kubenswrapper[4171]: I0223 14:16:44.171935 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:44.177639 master-0 kubenswrapper[4171]: E0223 14:16:44.177523 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d165655fa3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d165655fa3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,LastTimestamp:2026-02-23 14:16:31.161187966 +0000 UTC m=+1.264589495,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.183947 master-0 kubenswrapper[4171]: E0223 14:16:44.183854 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d16564e30a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:31.162274422 +0000 UTC m=+1.265675951,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.190397 master-0 kubenswrapper[4171]: E0223 14:16:44.190242 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d1656538f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:31.162298083 +0000 UTC m=+1.265699611,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.196759 master-0 kubenswrapper[4171]: E0223 14:16:44.196635 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d165655fa3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d165655fa3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,LastTimestamp:2026-02-23 14:16:31.162317595 +0000 UTC m=+1.265719124,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.201882 master-0 kubenswrapper[4171]: E0223 14:16:44.201786 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d16564e30a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:31.162841785 +0000 UTC m=+1.266243314,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.208629 master-0 kubenswrapper[4171]: E0223 14:16:44.208425 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d1656538f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:31.162881639 +0000 UTC m=+1.266283168,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.214688 master-0 kubenswrapper[4171]: E0223 14:16:44.214520 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d165655fa3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d165655fa3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928936867 +0000 UTC m=+1.032338366,LastTimestamp:2026-02-23 14:16:31.162897403 +0000 UTC m=+1.266298922,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.222096 master-0 kubenswrapper[4171]: E0223 14:16:44.221999 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d16564e30a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d16564e30a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.92890497 +0000 UTC m=+1.032306469,LastTimestamp:2026-02-23 14:16:31.163514537 +0000 UTC m=+1.266916066,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.227263 master-0 kubenswrapper[4171]: E0223 14:16:44.227149 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.1896e5d1656538f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.1896e5d1656538f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:30.928926961 +0000 UTC m=+1.032328460,LastTimestamp:2026-02-23 14:16:31.163548153 +0000 UTC m=+1.266949682,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.235670 master-0 kubenswrapper[4171]: E0223 14:16:44.235517 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.1896e5d1c0f02e20 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:56c3cb71c9851003c8de7e7c5db4b87e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:32.464760352 +0000 UTC m=+2.568161881,LastTimestamp:2026-02-23 14:16:32.464760352 +0000 UTC m=+2.568161881,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.242168 master-0 kubenswrapper[4171]: E0223 14:16:44.242054 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5d1c61b18f0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:32.551459056 +0000 UTC m=+2.654860585,LastTimestamp:2026-02-23 14:16:32.551459056 +0000 UTC m=+2.654860585,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.247630 master-0 kubenswrapper[4171]: E0223 14:16:44.247474 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d1caab4883 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb2014728aa54e620f65424402b14c5247016734a9a982c393dc011acb1a1f52\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:32.628017283 +0000 UTC m=+2.731418812,LastTimestamp:2026-02-23 14:16:32.628017283 +0000 UTC m=+2.731418812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.254158 master-0 kubenswrapper[4171]: E0223 14:16:44.254033 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d1cfd62475 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:32.714712181 +0000 UTC m=+2.818113700,LastTimestamp:2026-02-23 
14:16:32.714712181 +0000 UTC m=+2.818113700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.261218 master-0 kubenswrapper[4171]: E0223 14:16:44.261063 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d1df5e6ae4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:32.975301348 +0000 UTC m=+3.078702867,LastTimestamp:2026-02-23 14:16:32.975301348 +0000 UTC m=+3.078702867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.269882 master-0 kubenswrapper[4171]: E0223 14:16:44.269777 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d248fe16dc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb2014728aa54e620f65424402b14c5247016734a9a982c393dc011acb1a1f52\" in 2.119s (2.119s including waiting). Image size: 464984427 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:34.747373276 +0000 UTC m=+4.850774765,LastTimestamp:2026-02-23 14:16:34.747373276 +0000 UTC m=+4.850774765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.277262 master-0 kubenswrapper[4171]: E0223 14:16:44.277105 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5d249c453b4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\" in 2.208s (2.208s including waiting). 
Image size: 529218694 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:34.76036498 +0000 UTC m=+4.863766469,LastTimestamp:2026-02-23 14:16:34.76036498 +0000 UTC m=+4.863766469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.284632 master-0 kubenswrapper[4171]: E0223 14:16:44.284561 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5d2549db5d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:34.94238357 +0000 UTC m=+5.045785059,LastTimestamp:2026-02-23 14:16:34.94238357 +0000 UTC m=+5.045785059,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.293193 master-0 kubenswrapper[4171]: E0223 14:16:44.293066 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d254bd56d0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:34.9444564 +0000 UTC m=+5.047857889,LastTimestamp:2026-02-23 14:16:34.9444564 +0000 UTC m=+5.047857889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.300316 master-0 kubenswrapper[4171]: E0223 14:16:44.300210 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5d255ccbaa9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:34.962242217 +0000 UTC m=+5.065643706,LastTimestamp:2026-02-23 14:16:34.962242217 +0000 UTC m=+5.065643706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.307873 master-0 kubenswrapper[4171]: E0223 14:16:44.307744 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5d255f662f8 openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:34.96497228 +0000 UTC m=+5.068373779,LastTimestamp:2026-02-23 14:16:34.96497228 +0000 UTC m=+5.068373779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.313814 master-0 kubenswrapper[4171]: E0223 14:16:44.313740 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d25685260d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:34.974328333 +0000 UTC m=+5.077729822,LastTimestamp:2026-02-23 14:16:34.974328333 +0000 UTC m=+5.077729822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.319609 master-0 kubenswrapper[4171]: E0223 14:16:44.319459 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d25d007e84 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb2014728aa54e620f65424402b14c5247016734a9a982c393dc011acb1a1f52\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.083075204 +0000 UTC m=+5.186476713,LastTimestamp:2026-02-23 14:16:35.083075204 +0000 UTC m=+5.186476713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.326762 master-0 kubenswrapper[4171]: E0223 14:16:44.326631 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5d2602c7385 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.136287621 +0000 UTC m=+5.239689110,LastTimestamp:2026-02-23 14:16:35.136287621 +0000 UTC m=+5.239689110,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.333035 master-0 kubenswrapper[4171]: E0223 14:16:44.332936 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5d2614ab705 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.155048197 +0000 UTC m=+5.258449686,LastTimestamp:2026-02-23 14:16:35.155048197 +0000 UTC m=+5.258449686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.340431 master-0 kubenswrapper[4171]: E0223 14:16:44.340312 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d26b0d67d7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.318802391 +0000 UTC m=+5.422203880,LastTimestamp:2026-02-23 
14:16:35.318802391 +0000 UTC m=+5.422203880,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.348350 master-0 kubenswrapper[4171]: E0223 14:16:44.348209 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d26c0fbde7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.335732711 +0000 UTC m=+5.439134200,LastTimestamp:2026-02-23 14:16:35.335732711 +0000 UTC m=+5.439134200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.357454 master-0 kubenswrapper[4171]: E0223 14:16:44.357243 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.1896e5d25d007e84\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d25d007e84 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb2014728aa54e620f65424402b14c5247016734a9a982c393dc011acb1a1f52\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.083075204 +0000 UTC m=+5.186476713,LastTimestamp:2026-02-23 14:16:38.125978468 +0000 UTC m=+8.229379957,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.364463 master-0 kubenswrapper[4171]: E0223 14:16:44.364312 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.1896e5d3184e73d1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:56c3cb71c9851003c8de7e7c5db4b87e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\" in 5.76s (5.76s including waiting). 
Image size: 943734757 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.225523665 +0000 UTC m=+8.328925194,LastTimestamp:2026-02-23 14:16:38.225523665 +0000 UTC m=+8.328925194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.369812 master-0 kubenswrapper[4171]: E0223 14:16:44.369669 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d319ba043e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\" in 5.274s (5.274s including waiting). 
Image size: 943734757 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.249350206 +0000 UTC m=+8.352751695,LastTimestamp:2026-02-23 14:16:38.249350206 +0000 UTC m=+8.352751695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.375803 master-0 kubenswrapper[4171]: E0223 14:16:44.375666 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d31b82a317 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\" in 5.564s (5.564s including waiting). 
Image size: 943734757 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.279275287 +0000 UTC m=+8.382676786,LastTimestamp:2026-02-23 14:16:38.279275287 +0000 UTC m=+8.382676786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.384446 master-0 kubenswrapper[4171]: E0223 14:16:44.384331 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.1896e5d26b0d67d7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d26b0d67d7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.318802391 +0000 UTC m=+5.422203880,LastTimestamp:2026-02-23 14:16:38.391559938 +0000 UTC m=+8.494961437,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.391460 master-0 kubenswrapper[4171]: E0223 14:16:44.391333 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.1896e5d26c0fbde7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d26c0fbde7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.335732711 +0000 UTC m=+5.439134200,LastTimestamp:2026-02-23 14:16:38.469247381 +0000 UTC m=+8.572648870,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.398331 master-0 kubenswrapper[4171]: E0223 14:16:44.398219 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d32c0557d2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.55627669 +0000 UTC m=+8.659678179,LastTimestamp:2026-02-23 14:16:38.55627669 +0000 UTC m=+8.659678179,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.405557 master-0 kubenswrapper[4171]: E0223 14:16:44.405349 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d32cb5d23b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.567842363 +0000 UTC m=+8.671243852,LastTimestamp:2026-02-23 14:16:38.567842363 +0000 UTC m=+8.671243852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.414699 master-0 kubenswrapper[4171]: E0223 14:16:44.414531 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.1896e5d32d09296b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:56c3cb71c9851003c8de7e7c5db4b87e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.573304171 +0000 UTC m=+8.676705700,LastTimestamp:2026-02-23 14:16:38.573304171 +0000 UTC m=+8.676705700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.422081 master-0 kubenswrapper[4171]: E0223 14:16:44.421937 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d32f6891a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.613111207 +0000 UTC m=+8.716512696,LastTimestamp:2026-02-23 14:16:38.613111207 +0000 UTC m=+8.716512696,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.433250 master-0 kubenswrapper[4171]: E0223 14:16:44.433089 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d334c4724a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.70301857 +0000 UTC m=+8.806420099,LastTimestamp:2026-02-23 14:16:38.70301857 +0000 UTC m=+8.806420099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.442032 master-0 kubenswrapper[4171]: 
E0223 14:16:44.441775 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d334d9f8be kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.704429246 +0000 UTC m=+8.807830745,LastTimestamp:2026-02-23 14:16:38.704429246 +0000 UTC m=+8.807830745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.448595 master-0 kubenswrapper[4171]: E0223 14:16:44.448447 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.1896e5d335786673 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:56c3cb71c9851003c8de7e7c5db4b87e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.714812019 +0000 UTC m=+8.818213548,LastTimestamp:2026-02-23 14:16:38.714812019 +0000 UTC 
m=+8.818213548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.455349 master-0 kubenswrapper[4171]: E0223 14:16:44.455216 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d34c3c03bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:39.096730555 +0000 UTC m=+9.200132054,LastTimestamp:2026-02-23 14:16:39.096730555 +0000 UTC m=+9.200132054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.463203 master-0 kubenswrapper[4171]: E0223 14:16:44.462973 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d34c8f85c5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:39.102203333 +0000 UTC m=+9.205604862,LastTimestamp:2026-02-23 14:16:39.102203333 +0000 UTC m=+9.205604862,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.470511 master-0 kubenswrapper[4171]: E0223 14:16:44.470359 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d366b96ed8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:39.541157592 +0000 UTC m=+9.644559111,LastTimestamp:2026-02-23 14:16:39.541157592 +0000 UTC m=+9.644559111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.476719 master-0 kubenswrapper[4171]: E0223 14:16:44.476565 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d36f7e4246 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:39.688274502 +0000 UTC m=+9.791676031,LastTimestamp:2026-02-23 14:16:39.688274502 +0000 UTC m=+9.791676031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.482914 master-0 kubenswrapper[4171]: E0223 14:16:44.482772 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d36f9abbad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:39.690140589 +0000 UTC m=+9.793542118,LastTimestamp:2026-02-23 14:16:39.690140589 +0000 UTC m=+9.793542118,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.488415 master-0 kubenswrapper[4171]: E0223 14:16:44.488005 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.1896e5d34c8f85c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d34c8f85c5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:39.102203333 +0000 UTC m=+9.205604862,LastTimestamp:2026-02-23 14:16:40.145775219 +0000 UTC m=+10.249176728,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.497511 master-0 kubenswrapper[4171]: E0223 14:16:44.497299 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d40e042b8b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\" in 3.643s (3.643s including waiting). Image size: 505137106 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:42.347850635 +0000 UTC m=+12.451252124,LastTimestamp:2026-02-23 14:16:42.347850635 +0000 UTC m=+12.451252124,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.504109 master-0 kubenswrapper[4171]: E0223 14:16:44.503965 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d40f14d49b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\" in 2.675s (2.675s including waiting). 
Image size: 514875199 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:42.365719707 +0000 UTC m=+12.469121196,LastTimestamp:2026-02-23 14:16:42.365719707 +0000 UTC m=+12.469121196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.510871 master-0 kubenswrapper[4171]: E0223 14:16:44.510742 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d41c97e975 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:42.592414069 +0000 UTC m=+12.695815558,LastTimestamp:2026-02-23 14:16:42.592414069 +0000 UTC m=+12.695815558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.517168 master-0 kubenswrapper[4171]: E0223 14:16:44.517019 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d41ca284ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:42.593109226 +0000 UTC m=+12.696510715,LastTimestamp:2026-02-23 14:16:42.593109226 +0000 UTC m=+12.696510715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.523731 master-0 kubenswrapper[4171]: E0223 14:16:44.523537 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.1896e5d41da06665 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:687e92a6cecf1e2beeef16a0b322ad08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:42.609747557 +0000 UTC m=+12.713149046,LastTimestamp:2026-02-23 14:16:42.609747557 +0000 UTC m=+12.713149046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.530080 master-0 kubenswrapper[4171]: E0223 14:16:44.529944 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d41dcd8820 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:42.612705312 +0000 UTC m=+12.716106811,LastTimestamp:2026-02-23 14:16:42.612705312 +0000 UTC m=+12.716106811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.537347 master-0 kubenswrapper[4171]: E0223 14:16:44.537208 4171 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d43e83ae4c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:43.161513548 +0000 UTC m=+13.264915027,LastTimestamp:2026-02-23 14:16:43.161513548 +0000 UTC m=+13.264915027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.544722 master-0 kubenswrapper[4171]: E0223 14:16:44.544522 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.1896e5d32cb5d23b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d32cb5d23b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.567842363 +0000 UTC m=+8.671243852,LastTimestamp:2026-02-23 14:16:43.648941932 +0000 UTC m=+13.752343461,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.552415 master-0 kubenswrapper[4171]: E0223 14:16:44.552272 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.1896e5d334c4724a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.1896e5d334c4724a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:c9ad9373c007a4fcd25e70622bdc8deb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:38.70301857 +0000 UTC m=+8.806420099,LastTimestamp:2026-02-23 14:16:43.882564228 +0000 UTC m=+13.985965757,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:44.870139 master-0 kubenswrapper[4171]: I0223 14:16:44.869994 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:45.142381 master-0 kubenswrapper[4171]: I0223 14:16:45.142123 4171 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:16:45.151787 master-0 kubenswrapper[4171]: I0223 14:16:45.151733 4171 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:16:45.171814 master-0 kubenswrapper[4171]: I0223 14:16:45.171711 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:45.171814 master-0 kubenswrapper[4171]: I0223 14:16:45.171711 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:45.172088 master-0 kubenswrapper[4171]: I0223 14:16:45.172040 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:16:45.172554 master-0 kubenswrapper[4171]: I0223 14:16:45.172465 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:45.172554 master-0 kubenswrapper[4171]: I0223 14:16:45.172540 4171 kubelet_node_status.go:724] "Recording event message 
for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:45.172554 master-0 kubenswrapper[4171]: I0223 14:16:45.172552 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:45.172805 master-0 kubenswrapper[4171]: I0223 14:16:45.172753 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:45.172948 master-0 kubenswrapper[4171]: I0223 14:16:45.172867 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:45.173022 master-0 kubenswrapper[4171]: I0223 14:16:45.172985 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:45.177858 master-0 kubenswrapper[4171]: I0223 14:16:45.177804 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:16:45.414870 master-0 kubenswrapper[4171]: I0223 14:16:45.414661 4171 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 14:16:45.457964 master-0 kubenswrapper[4171]: I0223 14:16:45.457836 4171 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 23 14:16:45.868991 master-0 kubenswrapper[4171]: I0223 14:16:45.868913 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:45.933024 master-0 kubenswrapper[4171]: W0223 14:16:45.932932 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the 
cluster scope Feb 23 14:16:45.933024 master-0 kubenswrapper[4171]: E0223 14:16:45.933012 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 23 14:16:46.173984 master-0 kubenswrapper[4171]: I0223 14:16:46.173823 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:46.175030 master-0 kubenswrapper[4171]: I0223 14:16:46.174961 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:46.175030 master-0 kubenswrapper[4171]: I0223 14:16:46.175009 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:46.175030 master-0 kubenswrapper[4171]: I0223 14:16:46.175024 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:46.870963 master-0 kubenswrapper[4171]: I0223 14:16:46.870886 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:47.176833 master-0 kubenswrapper[4171]: I0223 14:16:47.176628 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:47.178103 master-0 kubenswrapper[4171]: I0223 14:16:47.178050 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:47.178171 master-0 kubenswrapper[4171]: I0223 14:16:47.178110 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Feb 23 14:16:47.178171 master-0 kubenswrapper[4171]: I0223 14:16:47.178130 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:47.871778 master-0 kubenswrapper[4171]: I0223 14:16:47.871685 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:48.365571 master-0 kubenswrapper[4171]: I0223 14:16:48.365403 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:48.366456 master-0 kubenswrapper[4171]: I0223 14:16:48.365814 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:48.367911 master-0 kubenswrapper[4171]: I0223 14:16:48.367837 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:48.368026 master-0 kubenswrapper[4171]: I0223 14:16:48.367918 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:48.368026 master-0 kubenswrapper[4171]: I0223 14:16:48.367938 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:48.688119 master-0 kubenswrapper[4171]: I0223 14:16:48.688020 4171 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:48.694395 master-0 kubenswrapper[4171]: I0223 14:16:48.694303 4171 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:48.870516 master-0 kubenswrapper[4171]: I0223 14:16:48.870386 4171 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:49.105375 master-0 kubenswrapper[4171]: W0223 14:16:49.105289 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 23 14:16:49.105375 master-0 kubenswrapper[4171]: E0223 14:16:49.105357 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 23 14:16:49.180917 master-0 kubenswrapper[4171]: I0223 14:16:49.180864 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:49.182166 master-0 kubenswrapper[4171]: I0223 14:16:49.182116 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:49.182310 master-0 kubenswrapper[4171]: I0223 14:16:49.182256 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:49.182365 master-0 kubenswrapper[4171]: I0223 14:16:49.182343 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:49.869508 master-0 kubenswrapper[4171]: I0223 14:16:49.869358 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Feb 23 14:16:50.128461 master-0 kubenswrapper[4171]: I0223 14:16:50.128315 4171 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:50.135200 master-0 kubenswrapper[4171]: I0223 14:16:50.135125 4171 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:50.183549 master-0 kubenswrapper[4171]: I0223 14:16:50.183026 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:50.184683 master-0 kubenswrapper[4171]: I0223 14:16:50.184614 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:50.184683 master-0 kubenswrapper[4171]: I0223 14:16:50.184675 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:50.184867 master-0 kubenswrapper[4171]: I0223 14:16:50.184699 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:50.502800 master-0 kubenswrapper[4171]: E0223 14:16:50.502743 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 23 14:16:50.588790 master-0 kubenswrapper[4171]: I0223 14:16:50.588723 4171 csr.go:261] certificate signing request csr-h885b is approved, waiting to be issued Feb 23 14:16:50.751762 master-0 kubenswrapper[4171]: I0223 14:16:50.751625 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:50.753061 master-0 kubenswrapper[4171]: I0223 14:16:50.752941 4171 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:50.753061 master-0 kubenswrapper[4171]: I0223 14:16:50.753005 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:50.753061 master-0 kubenswrapper[4171]: I0223 14:16:50.753023 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:50.753341 master-0 kubenswrapper[4171]: I0223 14:16:50.753153 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:16:50.760906 master-0 kubenswrapper[4171]: E0223 14:16:50.760843 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Feb 23 14:16:50.872016 master-0 kubenswrapper[4171]: I0223 14:16:50.871908 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:51.025825 master-0 kubenswrapper[4171]: E0223 14:16:51.025622 4171 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Feb 23 14:16:51.057697 master-0 kubenswrapper[4171]: I0223 14:16:51.057602 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:51.058684 master-0 kubenswrapper[4171]: I0223 14:16:51.058656 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:51.058761 master-0 kubenswrapper[4171]: I0223 14:16:51.058693 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Feb 23 14:16:51.058761 master-0 kubenswrapper[4171]: I0223 14:16:51.058701 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:51.059006 master-0 kubenswrapper[4171]: I0223 14:16:51.058985 4171 scope.go:117] "RemoveContainer" containerID="e6d5d550060f9bd110cd9ce89235c3a55d49cefa1174199bfdb6dbdcec650ff6" Feb 23 14:16:51.068040 master-0 kubenswrapper[4171]: E0223 14:16:51.067867 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.1896e5d25d007e84\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d25d007e84 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb2014728aa54e620f65424402b14c5247016734a9a982c393dc011acb1a1f52\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.083075204 +0000 UTC m=+5.186476713,LastTimestamp:2026-02-23 14:16:51.062550545 +0000 UTC m=+21.165952034,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:51.185378 master-0 kubenswrapper[4171]: I0223 14:16:51.185340 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:51.186270 master-0 kubenswrapper[4171]: I0223 14:16:51.186226 4171 kubelet_node_status.go:724] "Recording event message for 
node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:51.186270 master-0 kubenswrapper[4171]: I0223 14:16:51.186263 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:51.186270 master-0 kubenswrapper[4171]: I0223 14:16:51.186277 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:51.299112 master-0 kubenswrapper[4171]: E0223 14:16:51.298908 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.1896e5d26b0d67d7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d26b0d67d7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.318802391 +0000 UTC m=+5.422203880,LastTimestamp:2026-02-23 14:16:51.291341319 +0000 UTC m=+21.394742808,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:51.314519 master-0 kubenswrapper[4171]: E0223 14:16:51.314340 4171 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.1896e5d26c0fbde7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d26c0fbde7 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:35.335732711 +0000 UTC m=+5.439134200,LastTimestamp:2026-02-23 14:16:51.308528784 +0000 UTC m=+21.411930323,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:51.870159 master-0 kubenswrapper[4171]: I0223 14:16:51.870105 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:52.188573 master-0 kubenswrapper[4171]: I0223 14:16:52.188535 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/2.log" Feb 23 14:16:52.190012 master-0 kubenswrapper[4171]: I0223 14:16:52.189984 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/1.log" Feb 23 14:16:52.190700 master-0 kubenswrapper[4171]: I0223 14:16:52.190632 4171 generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19" exitCode=1 Feb 23 14:16:52.190700 master-0 kubenswrapper[4171]: I0223 14:16:52.190682 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19"} Feb 23 14:16:52.190880 master-0 kubenswrapper[4171]: I0223 14:16:52.190721 4171 scope.go:117] "RemoveContainer" containerID="e6d5d550060f9bd110cd9ce89235c3a55d49cefa1174199bfdb6dbdcec650ff6" Feb 23 14:16:52.190880 master-0 kubenswrapper[4171]: I0223 14:16:52.190832 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:52.191729 master-0 kubenswrapper[4171]: I0223 14:16:52.191687 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:52.191860 master-0 kubenswrapper[4171]: I0223 14:16:52.191753 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:52.191860 master-0 kubenswrapper[4171]: I0223 14:16:52.191777 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:52.192650 master-0 kubenswrapper[4171]: I0223 14:16:52.192184 4171 scope.go:117] "RemoveContainer" containerID="6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19" Feb 23 14:16:52.192650 master-0 kubenswrapper[4171]: E0223 14:16:52.192409 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="c997c8e9d3be51d454d8e61e376bef08" Feb 23 14:16:52.199306 master-0 kubenswrapper[4171]: E0223 14:16:52.198805 4171 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-rbac-proxy-crio-master-0.1896e5d34c8f85c5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.1896e5d34c8f85c5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:c997c8e9d3be51d454d8e61e376bef08,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:16:39.102203333 +0000 UTC m=+9.205604862,LastTimestamp:2026-02-23 14:16:52.192351629 +0000 UTC m=+22.295753158,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:16:52.231450 master-0 kubenswrapper[4171]: W0223 14:16:52.231373 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 23 14:16:52.231450 master-0 kubenswrapper[4171]: E0223 14:16:52.231440 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 23 14:16:52.868602 master-0 kubenswrapper[4171]: I0223 14:16:52.868567 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:52.992018 master-0 kubenswrapper[4171]: W0223 14:16:52.991946 4171 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:52.992402 master-0 kubenswrapper[4171]: E0223 14:16:52.992018 4171 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 23 14:16:53.195124 master-0 kubenswrapper[4171]: I0223 14:16:53.195043 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/2.log" Feb 23 14:16:53.870614 master-0 kubenswrapper[4171]: I0223 14:16:53.870519 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:54.170518 master-0 kubenswrapper[4171]: I0223 14:16:54.170339 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:54.170706 master-0 kubenswrapper[4171]: I0223 14:16:54.170547 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:54.171959 master-0 kubenswrapper[4171]: I0223 14:16:54.171912 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 
14:16:54.172031 master-0 kubenswrapper[4171]: I0223 14:16:54.171969 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:54.172031 master-0 kubenswrapper[4171]: I0223 14:16:54.171985 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:54.177065 master-0 kubenswrapper[4171]: I0223 14:16:54.177037 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:16:54.198107 master-0 kubenswrapper[4171]: I0223 14:16:54.198059 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:54.199630 master-0 kubenswrapper[4171]: I0223 14:16:54.199551 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:54.199630 master-0 kubenswrapper[4171]: I0223 14:16:54.199625 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:54.200000 master-0 kubenswrapper[4171]: I0223 14:16:54.199652 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:54.869651 master-0 kubenswrapper[4171]: I0223 14:16:54.869558 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:55.871327 master-0 kubenswrapper[4171]: I0223 14:16:55.871207 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:56.869381 master-0 
kubenswrapper[4171]: I0223 14:16:56.869287 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:57.511748 master-0 kubenswrapper[4171]: E0223 14:16:57.511652 4171 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 23 14:16:57.761508 master-0 kubenswrapper[4171]: I0223 14:16:57.761372 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:16:57.763629 master-0 kubenswrapper[4171]: I0223 14:16:57.763349 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:16:57.763629 master-0 kubenswrapper[4171]: I0223 14:16:57.763449 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:16:57.763629 master-0 kubenswrapper[4171]: I0223 14:16:57.763514 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:16:57.764572 master-0 kubenswrapper[4171]: I0223 14:16:57.763641 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:16:57.771873 master-0 kubenswrapper[4171]: E0223 14:16:57.771810 4171 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Feb 23 14:16:57.869969 master-0 kubenswrapper[4171]: I0223 14:16:57.869884 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:58.868473 master-0 kubenswrapper[4171]: I0223 14:16:58.868377 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:16:59.868636 master-0 kubenswrapper[4171]: I0223 14:16:59.868565 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:17:00.867157 master-0 kubenswrapper[4171]: I0223 14:17:00.867051 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:17:01.026883 master-0 kubenswrapper[4171]: E0223 14:17:01.026744 4171 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Feb 23 14:17:01.871074 master-0 kubenswrapper[4171]: I0223 14:17:01.871020 4171 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 23 14:17:02.116591 master-0 kubenswrapper[4171]: I0223 14:17:02.116539 4171 csr.go:257] certificate signing request csr-h885b is issued Feb 23 14:17:02.721513 master-0 kubenswrapper[4171]: I0223 14:17:02.721445 4171 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 23 
14:17:02.873054 master-0 kubenswrapper[4171]: I0223 14:17:02.873001 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:02.888338 master-0 kubenswrapper[4171]: I0223 14:17:02.888290 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:02.950514 master-0 kubenswrapper[4171]: I0223 14:17:02.950434 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.118125 master-0 kubenswrapper[4171]: I0223 14:17:03.118000 4171 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 14:08:38 +0000 UTC, rotation deadline is 2026-02-24 11:44:05.679184317 +0000 UTC Feb 23 14:17:03.118125 master-0 kubenswrapper[4171]: I0223 14:17:03.118059 4171 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h27m2.561128277s for next certificate rotation Feb 23 14:17:03.213501 master-0 kubenswrapper[4171]: I0223 14:17:03.213430 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.213501 master-0 kubenswrapper[4171]: E0223 14:17:03.213500 4171 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Feb 23 14:17:03.233126 master-0 kubenswrapper[4171]: I0223 14:17:03.233093 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.248463 master-0 kubenswrapper[4171]: I0223 14:17:03.248420 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.313209 master-0 kubenswrapper[4171]: I0223 14:17:03.313172 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.546807 master-0 kubenswrapper[4171]: I0223 14:17:03.546717 4171 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Feb 23 14:17:03.572433 master-0 kubenswrapper[4171]: I0223 14:17:03.572358 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.572433 master-0 kubenswrapper[4171]: E0223 14:17:03.572406 4171 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Feb 23 14:17:03.671882 master-0 kubenswrapper[4171]: I0223 14:17:03.671808 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.688377 master-0 kubenswrapper[4171]: I0223 14:17:03.688270 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:03.745056 master-0 kubenswrapper[4171]: I0223 14:17:03.744980 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:04.015472 master-0 kubenswrapper[4171]: I0223 14:17:04.015383 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:04.015472 master-0 kubenswrapper[4171]: E0223 14:17:04.015425 4171 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Feb 23 14:17:04.517940 master-0 kubenswrapper[4171]: E0223 14:17:04.517884 4171 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Feb 23 14:17:04.566627 master-0 kubenswrapper[4171]: I0223 14:17:04.566556 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:04.581351 master-0 kubenswrapper[4171]: I0223 14:17:04.581261 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Feb 23 14:17:04.637277 master-0 kubenswrapper[4171]: I0223 14:17:04.637206 4171 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found 
Feb 23 14:17:04.772834 master-0 kubenswrapper[4171]: I0223 14:17:04.772671 4171 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:17:04.774352 master-0 kubenswrapper[4171]: I0223 14:17:04.774302 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:17:04.774352 master-0 kubenswrapper[4171]: I0223 14:17:04.774355 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:17:04.774580 master-0 kubenswrapper[4171]: I0223 14:17:04.774373 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:17:04.774580 master-0 kubenswrapper[4171]: I0223 14:17:04.774430 4171 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:17:04.799932 master-0 kubenswrapper[4171]: I0223 14:17:04.799872 4171 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Feb 23 14:17:04.799932 master-0 kubenswrapper[4171]: E0223 14:17:04.799913 4171 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Feb 23 14:17:04.809902 master-0 kubenswrapper[4171]: E0223 14:17:04.809867 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:17:04.893003 master-0 kubenswrapper[4171]: I0223 14:17:04.892953 4171 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 23 14:17:04.901061 master-0 kubenswrapper[4171]: I0223 14:17:04.901012 4171 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 23 14:17:04.910498 master-0 kubenswrapper[4171]: E0223 14:17:04.910427 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" 
not found" Feb 23 14:17:05.011271 master-0 kubenswrapper[4171]: E0223 14:17:05.011203 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:17:05.112451 master-0 kubenswrapper[4171]: E0223 14:17:05.112318 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:17:05.213533 master-0 kubenswrapper[4171]: E0223 14:17:05.213396 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:17:05.314500 master-0 kubenswrapper[4171]: E0223 14:17:05.314407 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:17:05.414994 master-0 kubenswrapper[4171]: E0223 14:17:05.414814 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:17:05.515130 master-0 kubenswrapper[4171]: E0223 14:17:05.515021 4171 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:17:05.566391 master-0 kubenswrapper[4171]: I0223 14:17:05.566204 4171 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 14:17:05.868896 master-0 kubenswrapper[4171]: I0223 14:17:05.868843 4171 apiserver.go:52] "Watching apiserver" Feb 23 14:17:05.873502 master-0 kubenswrapper[4171]: I0223 14:17:05.873406 4171 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 23 14:17:05.873706 master-0 kubenswrapper[4171]: I0223 14:17:05.873576 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-7d7db75979-x4qnw"] Feb 23 14:17:05.873822 master-0 kubenswrapper[4171]: I0223 14:17:05.873801 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:05.875393 master-0 kubenswrapper[4171]: I0223 14:17:05.875353 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 14:17:05.875638 master-0 kubenswrapper[4171]: I0223 14:17:05.875593 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 14:17:05.876365 master-0 kubenswrapper[4171]: I0223 14:17:05.876335 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 14:17:05.898572 master-0 kubenswrapper[4171]: I0223 14:17:05.898512 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:05.898572 master-0 kubenswrapper[4171]: I0223 14:17:05.898570 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brd4j\" (UniqueName: \"kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:05.899006 master-0 kubenswrapper[4171]: I0223 14:17:05.898645 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " 
pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:05.974306 master-0 kubenswrapper[4171]: I0223 14:17:05.974222 4171 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Feb 23 14:17:05.999409 master-0 kubenswrapper[4171]: I0223 14:17:05.999351 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:05.999409 master-0 kubenswrapper[4171]: I0223 14:17:05.999417 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:05.999772 master-0 kubenswrapper[4171]: I0223 14:17:05.999445 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brd4j\" (UniqueName: \"kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:05.999772 master-0 kubenswrapper[4171]: I0223 14:17:05.999725 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:17:06.001737 master-0 kubenswrapper[4171]: 
I0223 14:17:06.001672 4171 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 23 14:17:06.009678 master-0 kubenswrapper[4171]: I0223 14:17:06.009264 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:17:06.021412 master-0 kubenswrapper[4171]: I0223 14:17:06.021356 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brd4j\" (UniqueName: \"kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:17:06.073345 master-0 kubenswrapper[4171]: I0223 14:17:06.073262 4171 scope.go:117] "RemoveContainer" containerID="6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19"
Feb 23 14:17:06.073630 master-0 kubenswrapper[4171]: I0223 14:17:06.073285 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Feb 23 14:17:06.073630 master-0 kubenswrapper[4171]: E0223 14:17:06.073572 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="c997c8e9d3be51d454d8e61e376bef08"
Feb 23 14:17:06.186063 master-0 kubenswrapper[4171]: I0223 14:17:06.186002 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:17:06.229274 master-0 kubenswrapper[4171]: I0223 14:17:06.229213 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerStarted","Data":"7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe"}
Feb 23 14:17:06.229870 master-0 kubenswrapper[4171]: I0223 14:17:06.229836 4171 scope.go:117] "RemoveContainer" containerID="6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19"
Feb 23 14:17:06.230210 master-0 kubenswrapper[4171]: E0223 14:17:06.230117 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(c997c8e9d3be51d454d8e61e376bef08)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="c997c8e9d3be51d454d8e61e376bef08"
Feb 23 14:17:06.682325 master-0 kubenswrapper[4171]: I0223 14:17:06.682230 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"]
Feb 23 14:17:06.683418 master-0 kubenswrapper[4171]: I0223 14:17:06.682712 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.685982 master-0 kubenswrapper[4171]: I0223 14:17:06.685930 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 23 14:17:06.686102 master-0 kubenswrapper[4171]: I0223 14:17:06.685998 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 23 14:17:06.686102 master-0 kubenswrapper[4171]: I0223 14:17:06.685943 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 23 14:17:06.808526 master-0 kubenswrapper[4171]: I0223 14:17:06.808398 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.808526 master-0 kubenswrapper[4171]: I0223 14:17:06.808522 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.808872 master-0 kubenswrapper[4171]: I0223 14:17:06.808741 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.808872 master-0 kubenswrapper[4171]: I0223 14:17:06.808789 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.808872 master-0 kubenswrapper[4171]: I0223 14:17:06.808837 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.910245 master-0 kubenswrapper[4171]: I0223 14:17:06.910106 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.910245 master-0 kubenswrapper[4171]: I0223 14:17:06.910242 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.910635 master-0 kubenswrapper[4171]: I0223 14:17:06.910452 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.910749 master-0 kubenswrapper[4171]: I0223 14:17:06.910611 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.910749 master-0 kubenswrapper[4171]: I0223 14:17:06.910678 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.910923 master-0 kubenswrapper[4171]: I0223 14:17:06.910805 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.911024 master-0 kubenswrapper[4171]: I0223 14:17:06.910914 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.911775 master-0 kubenswrapper[4171]: I0223 14:17:06.911718 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:06.911975 master-0 kubenswrapper[4171]: E0223 14:17:06.911879 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:06.912371 master-0 kubenswrapper[4171]: E0223 14:17:06.912317 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:17:07.412008326 +0000 UTC m=+37.515409895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:06.932381 master-0 kubenswrapper[4171]: I0223 14:17:06.932224 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:07.284896 master-0 kubenswrapper[4171]: I0223 14:17:07.284747 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-r6z45"]
Feb 23 14:17:07.285334 master-0 kubenswrapper[4171]: I0223 14:17:07.285091 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.287620 master-0 kubenswrapper[4171]: I0223 14:17:07.287571 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Feb 23 14:17:07.287778 master-0 kubenswrapper[4171]: I0223 14:17:07.287583 4171 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Feb 23 14:17:07.287975 master-0 kubenswrapper[4171]: I0223 14:17:07.287908 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt"
Feb 23 14:17:07.289784 master-0 kubenswrapper[4171]: I0223 14:17:07.289749 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Feb 23 14:17:07.414383 master-0 kubenswrapper[4171]: I0223 14:17:07.414320 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-var-run-resolv-conf\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.414383 master-0 kubenswrapper[4171]: I0223 14:17:07.414388 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-sno-bootstrap-files\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.414708 master-0 kubenswrapper[4171]: I0223 14:17:07.414509 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-resolv-conf\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.414708 master-0 kubenswrapper[4171]: I0223 14:17:07.414623 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:07.414708 master-0 kubenswrapper[4171]: I0223 14:17:07.414666 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-ca-bundle\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.414903 master-0 kubenswrapper[4171]: I0223 14:17:07.414710 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7g2q\" (UniqueName: \"kubernetes.io/projected/0514f486-2562-473d-8b01-b69441b82367-kube-api-access-f7g2q\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.414903 master-0 kubenswrapper[4171]: E0223 14:17:07.414873 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:07.415020 master-0 kubenswrapper[4171]: E0223 14:17:07.414972 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:17:08.414945122 +0000 UTC m=+38.518346641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:07.417113 master-0 kubenswrapper[4171]: I0223 14:17:07.417066 4171 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 23 14:17:07.515882 master-0 kubenswrapper[4171]: I0223 14:17:07.515820 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-ca-bundle\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516081 master-0 kubenswrapper[4171]: I0223 14:17:07.515885 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7g2q\" (UniqueName: \"kubernetes.io/projected/0514f486-2562-473d-8b01-b69441b82367-kube-api-access-f7g2q\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516081 master-0 kubenswrapper[4171]: I0223 14:17:07.515923 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-var-run-resolv-conf\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516081 master-0 kubenswrapper[4171]: I0223 14:17:07.516022 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-ca-bundle\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516081 master-0 kubenswrapper[4171]: I0223 14:17:07.516064 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-sno-bootstrap-files\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516227 master-0 kubenswrapper[4171]: I0223 14:17:07.516132 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-sno-bootstrap-files\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516227 master-0 kubenswrapper[4171]: I0223 14:17:07.516146 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-resolv-conf\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516227 master-0 kubenswrapper[4171]: I0223 14:17:07.516197 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-resolv-conf\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.516335 master-0 kubenswrapper[4171]: I0223 14:17:07.516247 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-var-run-resolv-conf\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.537178 master-0 kubenswrapper[4171]: I0223 14:17:07.537084 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7g2q\" (UniqueName: \"kubernetes.io/projected/0514f486-2562-473d-8b01-b69441b82367-kube-api-access-f7g2q\") pod \"assisted-installer-controller-r6z45\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") " pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.621021 master-0 kubenswrapper[4171]: I0223 14:17:07.620936 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:07.631592 master-0 kubenswrapper[4171]: W0223 14:17:07.631552 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0514f486_2562_473d_8b01_b69441b82367.slice/crio-ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518 WatchSource:0}: Error finding container ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518: Status 404 returned error can't find the container with id ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518
Feb 23 14:17:08.236413 master-0 kubenswrapper[4171]: I0223 14:17:08.236322 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6z45" event={"ID":"0514f486-2562-473d-8b01-b69441b82367","Type":"ContainerStarted","Data":"ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518"}
Feb 23 14:17:08.424367 master-0 kubenswrapper[4171]: I0223 14:17:08.424283 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:08.424584 master-0 kubenswrapper[4171]: E0223 14:17:08.424540 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:08.424717 master-0 kubenswrapper[4171]: E0223 14:17:08.424683 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:17:10.424635641 +0000 UTC m=+40.528037160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:10.462573 master-0 kubenswrapper[4171]: I0223 14:17:10.462525 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:10.463346 master-0 kubenswrapper[4171]: E0223 14:17:10.462649 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:10.463346 master-0 kubenswrapper[4171]: E0223 14:17:10.462697 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:17:14.462682829 +0000 UTC m=+44.566084318 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:10.463346 master-0 kubenswrapper[4171]: I0223 14:17:10.462832 4171 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 23 14:17:11.246470 master-0 kubenswrapper[4171]: I0223 14:17:11.246380 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerStarted","Data":"444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae"}
Feb 23 14:17:11.865510 master-0 kubenswrapper[4171]: I0223 14:17:11.865306 4171 csr.go:261] certificate signing request csr-kqwvb is approved, waiting to be issued
Feb 23 14:17:11.877804 master-0 kubenswrapper[4171]: I0223 14:17:11.877594 4171 csr.go:257] certificate signing request csr-kqwvb is issued
Feb 23 14:17:12.879424 master-0 kubenswrapper[4171]: I0223 14:17:12.879334 4171 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 14:08:38 +0000 UTC, rotation deadline is 2026-02-24 11:38:52.353871636 +0000 UTC
Feb 23 14:17:12.879424 master-0 kubenswrapper[4171]: I0223 14:17:12.879394 4171 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 21h21m39.474479733s for next certificate rotation
Feb 23 14:17:13.880800 master-0 kubenswrapper[4171]: I0223 14:17:13.880637 4171 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 14:08:38 +0000 UTC, rotation deadline is 2026-02-24 07:53:01.8145465 +0000 UTC
Feb 23 14:17:13.880800 master-0 kubenswrapper[4171]: I0223 14:17:13.880697 4171 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h35m47.933854111s for next certificate rotation
Feb 23 14:17:14.186991 master-0 kubenswrapper[4171]: I0223 14:17:14.186842 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" podStartSLOduration=4.912714675 podStartE2EDuration="9.186802609s" podCreationTimestamp="2026-02-23 14:17:05 +0000 UTC" firstStartedPulling="2026-02-23 14:17:06.20777471 +0000 UTC m=+36.311176249" lastFinishedPulling="2026-02-23 14:17:10.481862694 +0000 UTC m=+40.585264183" observedRunningTime="2026-02-23 14:17:11.651698862 +0000 UTC m=+41.755100351" watchObservedRunningTime="2026-02-23 14:17:14.186802609 +0000 UTC m=+44.290204148"
Feb 23 14:17:14.187554 master-0 kubenswrapper[4171]: I0223 14:17:14.187471 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-6cc89"]
Feb 23 14:17:14.187996 master-0 kubenswrapper[4171]: I0223 14:17:14.187948 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-6cc89"
Feb 23 14:17:14.254433 master-0 kubenswrapper[4171]: I0223 14:17:14.254357 4171 generic.go:334] "Generic (PLEG): container finished" podID="0514f486-2562-473d-8b01-b69441b82367" containerID="ac1b1e24015c720352cbb49d46332282e9687278977a0db4df21fe4d03fe58bd" exitCode=0
Feb 23 14:17:14.254795 master-0 kubenswrapper[4171]: I0223 14:17:14.254445 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6z45" event={"ID":"0514f486-2562-473d-8b01-b69441b82367","Type":"ContainerDied","Data":"ac1b1e24015c720352cbb49d46332282e9687278977a0db4df21fe4d03fe58bd"}
Feb 23 14:17:14.298383 master-0 kubenswrapper[4171]: I0223 14:17:14.298294 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjb7b\" (UniqueName: \"kubernetes.io/projected/83ba77ca-71ea-4b69-8c91-f04b53b81aff-kube-api-access-cjb7b\") pod \"mtu-prober-6cc89\" (UID: \"83ba77ca-71ea-4b69-8c91-f04b53b81aff\") " pod="openshift-network-operator/mtu-prober-6cc89"
Feb 23 14:17:14.398946 master-0 kubenswrapper[4171]: I0223 14:17:14.398830 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjb7b\" (UniqueName: \"kubernetes.io/projected/83ba77ca-71ea-4b69-8c91-f04b53b81aff-kube-api-access-cjb7b\") pod \"mtu-prober-6cc89\" (UID: \"83ba77ca-71ea-4b69-8c91-f04b53b81aff\") " pod="openshift-network-operator/mtu-prober-6cc89"
Feb 23 14:17:14.428455 master-0 kubenswrapper[4171]: I0223 14:17:14.428351 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjb7b\" (UniqueName: \"kubernetes.io/projected/83ba77ca-71ea-4b69-8c91-f04b53b81aff-kube-api-access-cjb7b\") pod \"mtu-prober-6cc89\" (UID: \"83ba77ca-71ea-4b69-8c91-f04b53b81aff\") " pod="openshift-network-operator/mtu-prober-6cc89"
Feb 23 14:17:14.499702 master-0 kubenswrapper[4171]: I0223 14:17:14.499524 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:17:14.499702 master-0 kubenswrapper[4171]: E0223 14:17:14.499704 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:14.500025 master-0 kubenswrapper[4171]: E0223 14:17:14.499789 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:17:22.499766262 +0000 UTC m=+52.603167781 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:17:14.508934 master-0 kubenswrapper[4171]: I0223 14:17:14.508867 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-6cc89"
Feb 23 14:17:14.521597 master-0 kubenswrapper[4171]: W0223 14:17:14.521536 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83ba77ca_71ea_4b69_8c91_f04b53b81aff.slice/crio-703008ca22883d01a165a1fd5907792c8c2d30c013e9c784479d77eccc42b82c WatchSource:0}: Error finding container 703008ca22883d01a165a1fd5907792c8c2d30c013e9c784479d77eccc42b82c: Status 404 returned error can't find the container with id 703008ca22883d01a165a1fd5907792c8c2d30c013e9c784479d77eccc42b82c
Feb 23 14:17:15.259109 master-0 kubenswrapper[4171]: I0223 14:17:15.259016 4171 generic.go:334] "Generic (PLEG): container finished" podID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerID="6d4e5cbd51d6e2350099300783b6b53e026119467c4ee08ce357bbba7d0f9eaa" exitCode=0
Feb 23 14:17:15.259109 master-0 kubenswrapper[4171]: I0223 14:17:15.259080 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-6cc89" event={"ID":"83ba77ca-71ea-4b69-8c91-f04b53b81aff","Type":"ContainerDied","Data":"6d4e5cbd51d6e2350099300783b6b53e026119467c4ee08ce357bbba7d0f9eaa"}
Feb 23 14:17:15.260187 master-0 kubenswrapper[4171]: I0223 14:17:15.259159 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-6cc89" event={"ID":"83ba77ca-71ea-4b69-8c91-f04b53b81aff","Type":"ContainerStarted","Data":"703008ca22883d01a165a1fd5907792c8c2d30c013e9c784479d77eccc42b82c"}
Feb 23 14:17:15.279512 master-0 kubenswrapper[4171]: I0223 14:17:15.279403 4171 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:15.407259 master-0 kubenswrapper[4171]: I0223 14:17:15.407176 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-sno-bootstrap-files\") pod \"0514f486-2562-473d-8b01-b69441b82367\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") "
Feb 23 14:17:15.407451 master-0 kubenswrapper[4171]: I0223 14:17:15.407264 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7g2q\" (UniqueName: \"kubernetes.io/projected/0514f486-2562-473d-8b01-b69441b82367-kube-api-access-f7g2q\") pod \"0514f486-2562-473d-8b01-b69441b82367\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") "
Feb 23 14:17:15.407451 master-0 kubenswrapper[4171]: I0223 14:17:15.407312 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-ca-bundle\") pod \"0514f486-2562-473d-8b01-b69441b82367\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") "
Feb 23 14:17:15.407451 master-0 kubenswrapper[4171]: I0223 14:17:15.407361 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-var-run-resolv-conf\") pod \"0514f486-2562-473d-8b01-b69441b82367\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") "
Feb 23 14:17:15.407451 master-0 kubenswrapper[4171]: I0223 14:17:15.407405 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-resolv-conf\") pod \"0514f486-2562-473d-8b01-b69441b82367\" (UID: \"0514f486-2562-473d-8b01-b69441b82367\") "
Feb 23 14:17:15.407639 master-0 kubenswrapper[4171]: I0223 14:17:15.407595 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "0514f486-2562-473d-8b01-b69441b82367" (UID: "0514f486-2562-473d-8b01-b69441b82367"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:17:15.407689 master-0 kubenswrapper[4171]: I0223 14:17:15.407660 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "0514f486-2562-473d-8b01-b69441b82367" (UID: "0514f486-2562-473d-8b01-b69441b82367"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:17:15.407739 master-0 kubenswrapper[4171]: I0223 14:17:15.407706 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "0514f486-2562-473d-8b01-b69441b82367" (UID: "0514f486-2562-473d-8b01-b69441b82367"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:17:15.407739 master-0 kubenswrapper[4171]: I0223 14:17:15.407708 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "0514f486-2562-473d-8b01-b69441b82367" (UID: "0514f486-2562-473d-8b01-b69441b82367"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:17:15.412218 master-0 kubenswrapper[4171]: I0223 14:17:15.412162 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0514f486-2562-473d-8b01-b69441b82367-kube-api-access-f7g2q" (OuterVolumeSpecName: "kube-api-access-f7g2q") pod "0514f486-2562-473d-8b01-b69441b82367" (UID: "0514f486-2562-473d-8b01-b69441b82367"). InnerVolumeSpecName "kube-api-access-f7g2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:17:15.507977 master-0 kubenswrapper[4171]: I0223 14:17:15.507896 4171 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\""
Feb 23 14:17:15.507977 master-0 kubenswrapper[4171]: I0223 14:17:15.507945 4171 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7g2q\" (UniqueName: \"kubernetes.io/projected/0514f486-2562-473d-8b01-b69441b82367-kube-api-access-f7g2q\") on node \"master-0\" DevicePath \"\""
Feb 23 14:17:15.507977 master-0 kubenswrapper[4171]: I0223 14:17:15.507965 4171 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-ca-bundle\") on node \"master-0\" DevicePath \"\""
Feb 23 14:17:15.507977 master-0 kubenswrapper[4171]: I0223 14:17:15.507985 4171 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\""
Feb 23 14:17:15.508260 master-0 kubenswrapper[4171]: I0223 14:17:15.508120 4171 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/0514f486-2562-473d-8b01-b69441b82367-host-resolv-conf\") on node \"master-0\" DevicePath \"\""
Feb 23 14:17:16.263804 master-0 kubenswrapper[4171]: I0223 14:17:16.263619 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6z45" event={"ID":"0514f486-2562-473d-8b01-b69441b82367","Type":"ContainerDied","Data":"ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518"}
Feb 23 14:17:16.263804 master-0 kubenswrapper[4171]: I0223 14:17:16.263690 4171 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518"
Feb 23 14:17:16.263804 master-0 kubenswrapper[4171]: I0223 14:17:16.263649 4171 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:17:16.284831 master-0 kubenswrapper[4171]: I0223 14:17:16.284778 4171 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-6cc89"
Feb 23 14:17:16.416304 master-0 kubenswrapper[4171]: I0223 14:17:16.416159 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjb7b\" (UniqueName: \"kubernetes.io/projected/83ba77ca-71ea-4b69-8c91-f04b53b81aff-kube-api-access-cjb7b\") pod \"83ba77ca-71ea-4b69-8c91-f04b53b81aff\" (UID: \"83ba77ca-71ea-4b69-8c91-f04b53b81aff\") "
Feb 23 14:17:16.420722 master-0 kubenswrapper[4171]: I0223 14:17:16.420647 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ba77ca-71ea-4b69-8c91-f04b53b81aff-kube-api-access-cjb7b" (OuterVolumeSpecName: "kube-api-access-cjb7b") pod "83ba77ca-71ea-4b69-8c91-f04b53b81aff" (UID: "83ba77ca-71ea-4b69-8c91-f04b53b81aff"). InnerVolumeSpecName "kube-api-access-cjb7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:17:16.517042 master-0 kubenswrapper[4171]: I0223 14:17:16.516812 4171 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjb7b\" (UniqueName: \"kubernetes.io/projected/83ba77ca-71ea-4b69-8c91-f04b53b81aff-kube-api-access-cjb7b\") on node \"master-0\" DevicePath \"\"" Feb 23 14:17:17.057798 master-0 kubenswrapper[4171]: I0223 14:17:17.057690 4171 scope.go:117] "RemoveContainer" containerID="6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19" Feb 23 14:17:17.268904 master-0 kubenswrapper[4171]: I0223 14:17:17.267302 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-6cc89" event={"ID":"83ba77ca-71ea-4b69-8c91-f04b53b81aff","Type":"ContainerDied","Data":"703008ca22883d01a165a1fd5907792c8c2d30c013e9c784479d77eccc42b82c"} Feb 23 14:17:17.268904 master-0 kubenswrapper[4171]: I0223 14:17:17.267678 4171 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="703008ca22883d01a165a1fd5907792c8c2d30c013e9c784479d77eccc42b82c" Feb 23 14:17:17.268904 master-0 kubenswrapper[4171]: I0223 14:17:17.267675 4171 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-6cc89" Feb 23 14:17:18.272372 master-0 kubenswrapper[4171]: I0223 14:17:18.272294 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/2.log" Feb 23 14:17:18.273221 master-0 kubenswrapper[4171]: I0223 14:17:18.273000 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerStarted","Data":"404214ba54b8b128195146c77065f2702359c7ee02579e2cb25064ddce3c7dcc"} Feb 23 14:17:19.170178 master-0 kubenswrapper[4171]: I0223 14:17:19.170062 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=13.170025164 podStartE2EDuration="13.170025164s" podCreationTimestamp="2026-02-23 14:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:17:18.290377173 +0000 UTC m=+48.393778712" watchObservedRunningTime="2026-02-23 14:17:19.170025164 +0000 UTC m=+49.273426683" Feb 23 14:17:19.170564 master-0 kubenswrapper[4171]: I0223 14:17:19.170534 4171 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-6cc89"] Feb 23 14:17:19.175746 master-0 kubenswrapper[4171]: I0223 14:17:19.175642 4171 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-6cc89"] Feb 23 14:17:21.060935 master-0 kubenswrapper[4171]: I0223 14:17:21.060861 4171 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" path="/var/lib/kubelet/pods/83ba77ca-71ea-4b69-8c91-f04b53b81aff/volumes" Feb 23 14:17:22.564896 master-0 kubenswrapper[4171]: I0223 14:17:22.564815 4171 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:17:22.565412 master-0 kubenswrapper[4171]: E0223 14:17:22.565009 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Feb 23 14:17:22.565412 master-0 kubenswrapper[4171]: E0223 14:17:22.565102 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:17:38.565075324 +0000 UTC m=+68.668476853 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found Feb 23 14:17:24.103918 master-0 kubenswrapper[4171]: I0223 14:17:24.103861 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vdzqk"] Feb 23 14:17:24.104581 master-0 kubenswrapper[4171]: E0223 14:17:24.103986 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerName="prober" Feb 23 14:17:24.104581 master-0 kubenswrapper[4171]: I0223 14:17:24.104012 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerName="prober" Feb 23 14:17:24.104581 master-0 kubenswrapper[4171]: E0223 14:17:24.104033 4171 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:17:24.104581 master-0 kubenswrapper[4171]: I0223 14:17:24.104052 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:17:24.104581 master-0 kubenswrapper[4171]: I0223 14:17:24.104114 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:17:24.104581 master-0 kubenswrapper[4171]: I0223 14:17:24.104134 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerName="prober" Feb 23 14:17:24.104581 master-0 kubenswrapper[4171]: I0223 14:17:24.104454 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.107210 master-0 kubenswrapper[4171]: I0223 14:17:24.107143 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 14:17:24.108130 master-0 kubenswrapper[4171]: I0223 14:17:24.108064 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 14:17:24.108258 master-0 kubenswrapper[4171]: I0223 14:17:24.108212 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 14:17:24.109446 master-0 kubenswrapper[4171]: I0223 14:17:24.109338 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 14:17:24.263924 master-0 kubenswrapper[4171]: I0223 14:17:24.263857 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jdsv6"] Feb 23 14:17:24.266909 master-0 kubenswrapper[4171]: I0223 14:17:24.265852 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:17:24.269645 master-0 kubenswrapper[4171]: I0223 14:17:24.269328 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 14:17:24.269645 master-0 kubenswrapper[4171]: I0223 14:17:24.269508 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Feb 23 14:17:24.279527 master-0 kubenswrapper[4171]: I0223 14:17:24.279470 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279647 master-0 kubenswrapper[4171]: I0223 14:17:24.279533 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279647 master-0 kubenswrapper[4171]: I0223 14:17:24.279556 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279647 master-0 kubenswrapper[4171]: I0223 14:17:24.279580 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " 
pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279647 master-0 kubenswrapper[4171]: I0223 14:17:24.279602 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279647 master-0 kubenswrapper[4171]: I0223 14:17:24.279622 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279647 master-0 kubenswrapper[4171]: I0223 14:17:24.279642 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279663 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279706 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: 
\"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279725 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279749 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279771 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279791 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9z2f\" (UniqueName: \"kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279857 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy\") pod 
\"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279906 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.279924 master-0 kubenswrapper[4171]: I0223 14:17:24.279929 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.280292 master-0 kubenswrapper[4171]: I0223 14:17:24.279965 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380401 master-0 kubenswrapper[4171]: I0223 14:17:24.380223 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380401 master-0 kubenswrapper[4171]: I0223 14:17:24.380282 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: 
\"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:17:24.380401 master-0 kubenswrapper[4171]: I0223 14:17:24.380320 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380437 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380717 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380769 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380741 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380800 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380833 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380842 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380886 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380886 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88qnh\" (UniqueName: \"kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: 
\"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:17:24.380954 master-0 kubenswrapper[4171]: I0223 14:17:24.380934 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381014 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381032 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381071 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381086 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 
kubenswrapper[4171]: I0223 14:17:24.381132 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381158 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381173 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381207 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381237 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 
14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381284 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381327 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381406 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.381526 master-0 kubenswrapper[4171]: I0223 14:17:24.381523 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381529 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 
23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381619 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381571 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381655 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9z2f\" (UniqueName: \"kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381714 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381749 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381757 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381776 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381814 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381850 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.381889 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.382076 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.382307 master-0 kubenswrapper[4171]: I0223 14:17:24.382189 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.383354 master-0 kubenswrapper[4171]: I0223 14:17:24.382583 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.383354 master-0 kubenswrapper[4171]: I0223 14:17:24.382905 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.409149 master-0 kubenswrapper[4171]: I0223 14:17:24.409048 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9z2f\" (UniqueName: \"kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.426810 master-0 kubenswrapper[4171]: I0223 14:17:24.426725 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vdzqk"
Feb 23 14:17:24.444457 master-0 kubenswrapper[4171]: W0223 14:17:24.444385 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d80e28_0b64_4c5d_a9bc_99d843d40165.slice/crio-7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f WatchSource:0}: Error finding container 7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f: Status 404 returned error can't find the container with id 7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f
Feb 23 14:17:24.482512 master-0 kubenswrapper[4171]: I0223 14:17:24.482427 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.482693 master-0 kubenswrapper[4171]: I0223 14:17:24.482590 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.482693 master-0 kubenswrapper[4171]: I0223 14:17:24.482646 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.482867 master-0 kubenswrapper[4171]: I0223 14:17:24.482695 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.482867 master-0 kubenswrapper[4171]: I0223 14:17:24.482744 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.482867 master-0 kubenswrapper[4171]: I0223 14:17:24.482803 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88qnh\" (UniqueName: \"kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.483125 master-0 kubenswrapper[4171]: I0223 14:17:24.482873 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.483221 master-0 kubenswrapper[4171]: I0223 14:17:24.483133 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.483221 master-0 kubenswrapper[4171]: I0223 14:17:24.483153 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.483221 master-0 kubenswrapper[4171]: I0223 14:17:24.483202 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.483467 master-0 kubenswrapper[4171]: I0223 14:17:24.483258 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.483467 master-0 kubenswrapper[4171]: I0223 14:17:24.483308 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.483812 master-0 kubenswrapper[4171]: I0223 14:17:24.483740 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.484023 master-0 kubenswrapper[4171]: I0223 14:17:24.483969 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.484379 master-0 kubenswrapper[4171]: I0223 14:17:24.484315 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.512875 master-0 kubenswrapper[4171]: I0223 14:17:24.512792 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88qnh\" (UniqueName: \"kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.581588 master-0 kubenswrapper[4171]: I0223 14:17:24.581468 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:17:24.598289 master-0 kubenswrapper[4171]: W0223 14:17:24.598217 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod483786a0_0a29_44bf_bbd0_2f37e045aa2c.slice/crio-fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610 WatchSource:0}: Error finding container fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610: Status 404 returned error can't find the container with id fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610
Feb 23 14:17:25.048505 master-0 kubenswrapper[4171]: I0223 14:17:25.048408 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9dnsv"]
Feb 23 14:17:25.048910 master-0 kubenswrapper[4171]: I0223 14:17:25.048854 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:25.049028 master-0 kubenswrapper[4171]: E0223 14:17:25.048946 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:25.189422 master-0 kubenswrapper[4171]: I0223 14:17:25.189289 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzkcs\" (UniqueName: \"kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:25.189422 master-0 kubenswrapper[4171]: I0223 14:17:25.189329 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:25.290327 master-0 kubenswrapper[4171]: I0223 14:17:25.290250 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzkcs\" (UniqueName: \"kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:25.290327 master-0 kubenswrapper[4171]: I0223 14:17:25.290311 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:25.290693 master-0 kubenswrapper[4171]: E0223 14:17:25.290417 4171 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:25.290693 master-0 kubenswrapper[4171]: E0223 14:17:25.290492 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:17:25.790460372 +0000 UTC m=+55.893861861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:25.294433 master-0 kubenswrapper[4171]: I0223 14:17:25.294363 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerStarted","Data":"fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610"}
Feb 23 14:17:25.296077 master-0 kubenswrapper[4171]: I0223 14:17:25.296047 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vdzqk" event={"ID":"09d80e28-0b64-4c5d-a9bc-99d843d40165","Type":"ContainerStarted","Data":"7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f"}
Feb 23 14:17:25.309081 master-0 kubenswrapper[4171]: I0223 14:17:25.309014 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzkcs\" (UniqueName: \"kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:25.794401 master-0 kubenswrapper[4171]: I0223 14:17:25.794347 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:25.794635 master-0 kubenswrapper[4171]: E0223 14:17:25.794571 4171 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:25.794733 master-0 kubenswrapper[4171]: E0223 14:17:25.794695 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:17:26.7946613 +0000 UTC m=+56.898062899 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:26.802706 master-0 kubenswrapper[4171]: I0223 14:17:26.802603 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:26.803217 master-0 kubenswrapper[4171]: E0223 14:17:26.802826 4171 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:26.803217 master-0 kubenswrapper[4171]: E0223 14:17:26.802926 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:17:28.802901522 +0000 UTC m=+58.906303071 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:27.058076 master-0 kubenswrapper[4171]: I0223 14:17:27.057907 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:27.058076 master-0 kubenswrapper[4171]: E0223 14:17:27.058051 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:28.307690 master-0 kubenswrapper[4171]: I0223 14:17:28.307532 4171 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="7e56c504fefbada4ed2745ea4973c98d064a08b56a86637d3809d7946280cc20" exitCode=0
Feb 23 14:17:28.307690 master-0 kubenswrapper[4171]: I0223 14:17:28.307590 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"7e56c504fefbada4ed2745ea4973c98d064a08b56a86637d3809d7946280cc20"}
Feb 23 14:17:28.903655 master-0 kubenswrapper[4171]: I0223 14:17:28.903555 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:28.903923 master-0 kubenswrapper[4171]: E0223 14:17:28.903729 4171 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:28.903923 master-0 kubenswrapper[4171]: E0223 14:17:28.903845 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:17:32.90381845 +0000 UTC m=+63.007219979 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:29.057359 master-0 kubenswrapper[4171]: I0223 14:17:29.057297 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:29.057666 master-0 kubenswrapper[4171]: E0223 14:17:29.057459 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:31.057448 master-0 kubenswrapper[4171]: I0223 14:17:31.057269 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:31.058301 master-0 kubenswrapper[4171]: E0223 14:17:31.057668 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:32.957064 master-0 kubenswrapper[4171]: I0223 14:17:32.956999 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:32.957522 master-0 kubenswrapper[4171]: E0223 14:17:32.957121 4171 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:32.957522 master-0 kubenswrapper[4171]: E0223 14:17:32.957177 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:17:40.957162974 +0000 UTC m=+71.060564463 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:33.056729 master-0 kubenswrapper[4171]: I0223 14:17:33.056630 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:33.057018 master-0 kubenswrapper[4171]: E0223 14:17:33.056794 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:35.066225 master-0 kubenswrapper[4171]: I0223 14:17:35.066152 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:35.066685 master-0 kubenswrapper[4171]: E0223 14:17:35.066328 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:35.525268 master-0 kubenswrapper[4171]: I0223 14:17:35.524824 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerStarted","Data":"5fd96309ade76aec20ed37e459e178ae08d952af2aa513f3703806ca12a7c927"}
Feb 23 14:17:36.445748 master-0 kubenswrapper[4171]: I0223 14:17:36.445678 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"]
Feb 23 14:17:36.446945 master-0 kubenswrapper[4171]: I0223 14:17:36.446021 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.448407 master-0 kubenswrapper[4171]: I0223 14:17:36.448376 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 23 14:17:36.448553 master-0 kubenswrapper[4171]: I0223 14:17:36.448510 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 23 14:17:36.448628 master-0 kubenswrapper[4171]: I0223 14:17:36.448607 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 23 14:17:36.449591 master-0 kubenswrapper[4171]: I0223 14:17:36.449564 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 23 14:17:36.449677 master-0 kubenswrapper[4171]: I0223 14:17:36.449596 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 23 14:17:36.529132 master-0 kubenswrapper[4171]: I0223 14:17:36.529020 4171 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="5fd96309ade76aec20ed37e459e178ae08d952af2aa513f3703806ca12a7c927" exitCode=0
Feb 23 14:17:36.529415 master-0 kubenswrapper[4171]: I0223 14:17:36.529148 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"5fd96309ade76aec20ed37e459e178ae08d952af2aa513f3703806ca12a7c927"}
Feb 23 14:17:36.586187 master-0 kubenswrapper[4171]: I0223 14:17:36.586091 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjs6f\" (UniqueName: \"kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.586398 master-0 kubenswrapper[4171]: I0223 14:17:36.586231 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.586398 master-0 kubenswrapper[4171]: I0223 14:17:36.586323 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.586398 master-0 kubenswrapper[4171]: I0223 14:17:36.586359 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.653744 master-0 kubenswrapper[4171]: I0223 14:17:36.653680 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qtjsn"]
Feb 23 14:17:36.656782 master-0 kubenswrapper[4171]: I0223 14:17:36.655536 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.658547 master-0 kubenswrapper[4171]: I0223 14:17:36.657710 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 23 14:17:36.661061 master-0 kubenswrapper[4171]: I0223 14:17:36.660324 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 23 14:17:36.688296 master-0 kubenswrapper[4171]: I0223 14:17:36.688187 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjs6f\" (UniqueName: \"kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.688296 master-0 kubenswrapper[4171]: I0223 14:17:36.688293 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.688715 master-0 kubenswrapper[4171]: I0223 14:17:36.688679 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.688765 master-0 kubenswrapper[4171]: I0223 14:17:36.688711 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.689261 master-0 kubenswrapper[4171]: I0223 14:17:36.689219 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.689460 master-0 kubenswrapper[4171]: I0223 14:17:36.689422 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.697441 master-0 kubenswrapper[4171]: I0223 14:17:36.691995 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.708612 master-0 kubenswrapper[4171]: I0223 14:17:36.703708 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjs6f\" (UniqueName: \"kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.766538 master-0 kubenswrapper[4171]: I0223 14:17:36.766451 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:17:36.789566 master-0 kubenswrapper[4171]: I0223 14:17:36.789496 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-kubelet\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789566 master-0 kubenswrapper[4171]: I0223 14:17:36.789537 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-bin\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789566 master-0 kubenswrapper[4171]: I0223 14:17:36.789560 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-log-socket\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789566 master-0 kubenswrapper[4171]: I0223 14:17:36.789575 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-config\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789566 master-0 kubenswrapper[4171]: I0223 14:17:36.789593 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-node-log\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789608 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-netd\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789638 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-systemd\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789653 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789671 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-env-overrides\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn"
Feb 23 14:17:36.789943 master-0
kubenswrapper[4171]: I0223 14:17:36.789691 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-slash\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789711 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-netns\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789731 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789759 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-var-lib-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789779 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-systemd-units\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789799 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789821 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02998803-dd12-4c68-990e-efd131076a0f-ovn-node-metrics-cert\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789844 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-script-lib\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789859 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-ovn\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789876 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbg8\" (UniqueName: 
\"kubernetes.io/projected/02998803-dd12-4c68-990e-efd131076a0f-kube-api-access-ddbg8\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.789943 master-0 kubenswrapper[4171]: I0223 14:17:36.789890 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-etc-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.890956 master-0 kubenswrapper[4171]: I0223 14:17:36.890865 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-kubelet\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.890956 master-0 kubenswrapper[4171]: I0223 14:17:36.890937 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-bin\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891315 master-0 kubenswrapper[4171]: I0223 14:17:36.891073 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-bin\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891315 master-0 kubenswrapper[4171]: I0223 14:17:36.891137 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-log-socket\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891315 master-0 kubenswrapper[4171]: I0223 14:17:36.891164 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-node-log\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891315 master-0 kubenswrapper[4171]: I0223 14:17:36.891190 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-netd\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891315 master-0 kubenswrapper[4171]: I0223 14:17:36.891195 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-kubelet\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891315 master-0 kubenswrapper[4171]: I0223 14:17:36.891248 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-netd\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891315 master-0 kubenswrapper[4171]: I0223 14:17:36.891215 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-config\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891537 master-0 kubenswrapper[4171]: I0223 14:17:36.891313 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-node-log\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891537 master-0 kubenswrapper[4171]: I0223 14:17:36.891361 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-systemd\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891537 master-0 kubenswrapper[4171]: I0223 14:17:36.891203 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-log-socket\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891537 master-0 kubenswrapper[4171]: I0223 14:17:36.891413 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891537 master-0 kubenswrapper[4171]: I0223 14:17:36.891439 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-systemd\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891537 master-0 kubenswrapper[4171]: I0223 14:17:36.891459 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-slash\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891704 master-0 kubenswrapper[4171]: I0223 14:17:36.891562 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-ovn-kubernetes\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891704 master-0 kubenswrapper[4171]: I0223 14:17:36.891598 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-env-overrides\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891704 master-0 kubenswrapper[4171]: I0223 14:17:36.891621 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-netns\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891704 master-0 kubenswrapper[4171]: I0223 14:17:36.891647 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891809 master-0 kubenswrapper[4171]: I0223 14:17:36.891758 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-slash\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.891936 master-0 kubenswrapper[4171]: I0223 14:17:36.891888 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-var-lib-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892013 master-0 kubenswrapper[4171]: I0223 14:17:36.891984 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-systemd-units\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892013 master-0 kubenswrapper[4171]: I0223 14:17:36.891908 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892082 master-0 kubenswrapper[4171]: I0223 14:17:36.891988 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-var-lib-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892082 master-0 kubenswrapper[4171]: I0223 14:17:36.892038 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892082 master-0 kubenswrapper[4171]: I0223 14:17:36.892058 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-systemd-units\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892169 master-0 kubenswrapper[4171]: I0223 14:17:36.892087 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02998803-dd12-4c68-990e-efd131076a0f-ovn-node-metrics-cert\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892169 master-0 kubenswrapper[4171]: I0223 14:17:36.892133 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892169 master-0 kubenswrapper[4171]: I0223 14:17:36.892132 4171 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-script-lib\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892291 master-0 kubenswrapper[4171]: I0223 14:17:36.892209 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-netns\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892291 master-0 kubenswrapper[4171]: I0223 14:17:36.892232 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-config\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892291 master-0 kubenswrapper[4171]: I0223 14:17:36.892272 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-ovn\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892291 master-0 kubenswrapper[4171]: I0223 14:17:36.892290 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbg8\" (UniqueName: \"kubernetes.io/projected/02998803-dd12-4c68-990e-efd131076a0f-kube-api-access-ddbg8\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892446 master-0 kubenswrapper[4171]: I0223 14:17:36.892310 4171 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-etc-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892446 master-0 kubenswrapper[4171]: I0223 14:17:36.892364 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-etc-openvswitch\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892446 master-0 kubenswrapper[4171]: I0223 14:17:36.892366 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-ovn\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.892788 master-0 kubenswrapper[4171]: I0223 14:17:36.892749 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-env-overrides\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.893361 master-0 kubenswrapper[4171]: I0223 14:17:36.893310 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-script-lib\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.896690 master-0 kubenswrapper[4171]: I0223 14:17:36.895974 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02998803-dd12-4c68-990e-efd131076a0f-ovn-node-metrics-cert\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.910011 master-0 kubenswrapper[4171]: I0223 14:17:36.909951 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbg8\" (UniqueName: \"kubernetes.io/projected/02998803-dd12-4c68-990e-efd131076a0f-kube-api-access-ddbg8\") pod \"ovnkube-node-qtjsn\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:36.973307 master-0 kubenswrapper[4171]: I0223 14:17:36.972812 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:17:37.057286 master-0 kubenswrapper[4171]: I0223 14:17:37.057213 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:17:37.057720 master-0 kubenswrapper[4171]: E0223 14:17:37.057363 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:17:38.613682 master-0 kubenswrapper[4171]: I0223 14:17:38.613633 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:17:38.615812 master-0 kubenswrapper[4171]: E0223 14:17:38.613807 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Feb 23 14:17:38.615812 master-0 kubenswrapper[4171]: E0223 14:17:38.613902 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:18:10.613880256 +0000 UTC m=+100.717281745 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found Feb 23 14:17:39.057817 master-0 kubenswrapper[4171]: I0223 14:17:39.057735 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:17:39.058135 master-0 kubenswrapper[4171]: E0223 14:17:39.057954 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:17:39.560124 master-0 kubenswrapper[4171]: W0223 14:17:39.560070 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02998803_dd12_4c68_990e_efd131076a0f.slice/crio-477770141bf255a57c3800674a1973731f4824ba7eeefbf57c50b1692c3eba69 WatchSource:0}: Error finding container 477770141bf255a57c3800674a1973731f4824ba7eeefbf57c50b1692c3eba69: Status 404 returned error can't find the container with id 477770141bf255a57c3800674a1973731f4824ba7eeefbf57c50b1692c3eba69 Feb 23 14:17:39.560437 master-0 kubenswrapper[4171]: W0223 14:17:39.560410 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb090ed5a_984f_41dd_8cea_34a1ece1514f.slice/crio-f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e WatchSource:0}: Error finding container f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e: Status 404 returned error can't find the container with id f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e Feb 23 14:17:39.658215 master-0 kubenswrapper[4171]: I0223 14:17:39.658148 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-x9gxm"] Feb 23 14:17:39.659850 master-0 kubenswrapper[4171]: I0223 14:17:39.658592 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:17:39.659850 master-0 kubenswrapper[4171]: E0223 14:17:39.658713 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:39.825668 master-0 kubenswrapper[4171]: I0223 14:17:39.825595 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:39.927055 master-0 kubenswrapper[4171]: I0223 14:17:39.926985 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:39.939786 master-0 kubenswrapper[4171]: E0223 14:17:39.939730 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 14:17:39.939786 master-0 kubenswrapper[4171]: E0223 14:17:39.939772 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 14:17:39.939786 master-0 kubenswrapper[4171]: E0223 14:17:39.939787 4171 projected.go:194] Error preparing data for projected volume kube-api-access-x4lz2 for pod openshift-network-diagnostics/network-check-target-x9gxm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:39.940305 master-0 kubenswrapper[4171]: E0223 14:17:39.939854 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2 podName:ded555da-db03-498e-81a9-ad166f29a2aa nodeName:}" failed. No retries permitted until 2026-02-23 14:17:40.439835491 +0000 UTC m=+70.543236980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x4lz2" (UniqueName: "kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2") pod "network-check-target-x9gxm" (UID: "ded555da-db03-498e-81a9-ad166f29a2aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:40.532389 master-0 kubenswrapper[4171]: I0223 14:17:40.532330 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:40.532663 master-0 kubenswrapper[4171]: E0223 14:17:40.532612 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 14:17:40.532712 master-0 kubenswrapper[4171]: E0223 14:17:40.532671 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 14:17:40.532712 master-0 kubenswrapper[4171]: E0223 14:17:40.532688 4171 projected.go:194] Error preparing data for projected volume kube-api-access-x4lz2 for pod openshift-network-diagnostics/network-check-target-x9gxm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:40.532782 master-0 kubenswrapper[4171]: E0223 14:17:40.532760 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2 podName:ded555da-db03-498e-81a9-ad166f29a2aa nodeName:}" failed. No retries permitted until 2026-02-23 14:17:41.532738852 +0000 UTC m=+71.636140361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x4lz2" (UniqueName: "kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2") pod "network-check-target-x9gxm" (UID: "ded555da-db03-498e-81a9-ad166f29a2aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:40.540579 master-0 kubenswrapper[4171]: I0223 14:17:40.540520 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"477770141bf255a57c3800674a1973731f4824ba7eeefbf57c50b1692c3eba69"}
Feb 23 14:17:40.543622 master-0 kubenswrapper[4171]: I0223 14:17:40.543590 4171 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="29b61cbeccf4eaed8df82b56cbe6a444cd43fd7fd1043bff465ae48185e7e6a0" exitCode=0
Feb 23 14:17:40.543789 master-0 kubenswrapper[4171]: I0223 14:17:40.543729 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"29b61cbeccf4eaed8df82b56cbe6a444cd43fd7fd1043bff465ae48185e7e6a0"}
Feb 23 14:17:40.545391 master-0 kubenswrapper[4171]: I0223 14:17:40.545351 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vdzqk" event={"ID":"09d80e28-0b64-4c5d-a9bc-99d843d40165","Type":"ContainerStarted","Data":"f63f2509174f7f7271730d33288707704d50e4f6775be28d027d37556b5992a9"}
Feb 23 14:17:40.546593 master-0 kubenswrapper[4171]: I0223 14:17:40.546563 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerStarted","Data":"2b304c8b0f837d8a5676c01cc4b19f81d7aa44858d1d53ee5b0312db0b49e71f"}
Feb 23 14:17:40.546766 master-0 kubenswrapper[4171]: I0223 14:17:40.546599 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerStarted","Data":"f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e"}
Feb 23 14:17:41.036386 master-0 kubenswrapper[4171]: I0223 14:17:41.036339 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:41.036930 master-0 kubenswrapper[4171]: E0223 14:17:41.036496 4171 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:41.036930 master-0 kubenswrapper[4171]: E0223 14:17:41.036566 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:17:57.036547651 +0000 UTC m=+87.139949140 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:41.056833 master-0 kubenswrapper[4171]: I0223 14:17:41.056801 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:41.056931 master-0 kubenswrapper[4171]: I0223 14:17:41.056805 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:41.057341 master-0 kubenswrapper[4171]: E0223 14:17:41.057308 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:41.057515 master-0 kubenswrapper[4171]: E0223 14:17:41.057457 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:41.541875 master-0 kubenswrapper[4171]: I0223 14:17:41.541800 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:41.542083 master-0 kubenswrapper[4171]: E0223 14:17:41.542014 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 14:17:41.542083 master-0 kubenswrapper[4171]: E0223 14:17:41.542049 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 14:17:41.542083 master-0 kubenswrapper[4171]: E0223 14:17:41.542063 4171 projected.go:194] Error preparing data for projected volume kube-api-access-x4lz2 for pod openshift-network-diagnostics/network-check-target-x9gxm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:41.542167 master-0 kubenswrapper[4171]: E0223 14:17:41.542132 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2 podName:ded555da-db03-498e-81a9-ad166f29a2aa nodeName:}" failed. No retries permitted until 2026-02-23 14:17:43.542111583 +0000 UTC m=+73.645513132 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x4lz2" (UniqueName: "kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2") pod "network-check-target-x9gxm" (UID: "ded555da-db03-498e-81a9-ad166f29a2aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:42.176495 master-0 kubenswrapper[4171]: I0223 14:17:42.176400 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vdzqk" podStartSLOduration=2.849967563 podStartE2EDuration="18.17637433s" podCreationTimestamp="2026-02-23 14:17:24 +0000 UTC" firstStartedPulling="2026-02-23 14:17:24.447058467 +0000 UTC m=+54.550459966" lastFinishedPulling="2026-02-23 14:17:39.773465244 +0000 UTC m=+69.876866733" observedRunningTime="2026-02-23 14:17:40.571531217 +0000 UTC m=+70.674932716" watchObservedRunningTime="2026-02-23 14:17:42.17637433 +0000 UTC m=+72.279775859"
Feb 23 14:17:42.176922 master-0 kubenswrapper[4171]: I0223 14:17:42.176842 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Feb 23 14:17:42.262256 master-0 kubenswrapper[4171]: I0223 14:17:42.262195 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-td489"]
Feb 23 14:17:42.263063 master-0 kubenswrapper[4171]: I0223 14:17:42.263031 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.265522 master-0 kubenswrapper[4171]: I0223 14:17:42.265225 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 23 14:17:42.265522 master-0 kubenswrapper[4171]: I0223 14:17:42.265362 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 23 14:17:42.265738 master-0 kubenswrapper[4171]: I0223 14:17:42.265566 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 23 14:17:42.265978 master-0 kubenswrapper[4171]: I0223 14:17:42.265950 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 23 14:17:42.265978 master-0 kubenswrapper[4171]: I0223 14:17:42.265960 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 23 14:17:42.294879 master-0 kubenswrapper[4171]: I0223 14:17:42.294805 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=0.294787694 podStartE2EDuration="294.787694ms" podCreationTimestamp="2026-02-23 14:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:17:42.294731463 +0000 UTC m=+72.398132992" watchObservedRunningTime="2026-02-23 14:17:42.294787694 +0000 UTC m=+72.398189173"
Feb 23 14:17:42.449777 master-0 kubenswrapper[4171]: I0223 14:17:42.449712 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.449777 master-0 kubenswrapper[4171]: I0223 14:17:42.449771 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.449900 master-0 kubenswrapper[4171]: I0223 14:17:42.449805 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp5kb\" (UniqueName: \"kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.449900 master-0 kubenswrapper[4171]: I0223 14:17:42.449830 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.550541 master-0 kubenswrapper[4171]: I0223 14:17:42.550486 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.550541 master-0 kubenswrapper[4171]: I0223 14:17:42.550520 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.550541 master-0 kubenswrapper[4171]: I0223 14:17:42.550537 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5kb\" (UniqueName: \"kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.550541 master-0 kubenswrapper[4171]: I0223 14:17:42.550554 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.551434 master-0 kubenswrapper[4171]: I0223 14:17:42.551405 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.552357 master-0 kubenswrapper[4171]: I0223 14:17:42.552330 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.555723 master-0 kubenswrapper[4171]: I0223 14:17:42.555684 4171 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="e5ef5b210d67b35d196c3c58900eaedb9852f06e215b468c9e1c1dc53fce376f" exitCode=0
Feb 23 14:17:42.555959 master-0 kubenswrapper[4171]: I0223 14:17:42.555923 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.556175 master-0 kubenswrapper[4171]: I0223 14:17:42.556138 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"e5ef5b210d67b35d196c3c58900eaedb9852f06e215b468c9e1c1dc53fce376f"}
Feb 23 14:17:42.574665 master-0 kubenswrapper[4171]: I0223 14:17:42.574620 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5kb\" (UniqueName: \"kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:42.579806 master-0 kubenswrapper[4171]: I0223 14:17:42.579762 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:17:43.057657 master-0 kubenswrapper[4171]: I0223 14:17:43.057262 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:43.057657 master-0 kubenswrapper[4171]: I0223 14:17:43.057262 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:43.057923 master-0 kubenswrapper[4171]: E0223 14:17:43.057651 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:43.057923 master-0 kubenswrapper[4171]: E0223 14:17:43.057736 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:43.559214 master-0 kubenswrapper[4171]: I0223 14:17:43.559121 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:43.559991 master-0 kubenswrapper[4171]: E0223 14:17:43.559300 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 14:17:43.559991 master-0 kubenswrapper[4171]: E0223 14:17:43.559320 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 14:17:43.559991 master-0 kubenswrapper[4171]: E0223 14:17:43.559332 4171 projected.go:194] Error preparing data for projected volume kube-api-access-x4lz2 for pod openshift-network-diagnostics/network-check-target-x9gxm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:43.559991 master-0 kubenswrapper[4171]: E0223 14:17:43.559391 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2 podName:ded555da-db03-498e-81a9-ad166f29a2aa nodeName:}" failed. No retries permitted until 2026-02-23 14:17:47.559372646 +0000 UTC m=+77.662774145 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x4lz2" (UniqueName: "kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2") pod "network-check-target-x9gxm" (UID: "ded555da-db03-498e-81a9-ad166f29a2aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:43.563219 master-0 kubenswrapper[4171]: I0223 14:17:43.563160 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerStarted","Data":"a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff"}
Feb 23 14:17:45.057046 master-0 kubenswrapper[4171]: I0223 14:17:45.056995 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:45.057509 master-0 kubenswrapper[4171]: I0223 14:17:45.057006 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:45.057509 master-0 kubenswrapper[4171]: E0223 14:17:45.057121 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:45.057509 master-0 kubenswrapper[4171]: E0223 14:17:45.057213 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:47.056828 master-0 kubenswrapper[4171]: I0223 14:17:47.056762 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:47.057281 master-0 kubenswrapper[4171]: I0223 14:17:47.056855 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:47.057281 master-0 kubenswrapper[4171]: E0223 14:17:47.056902 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:47.057281 master-0 kubenswrapper[4171]: E0223 14:17:47.056981 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:47.591419 master-0 kubenswrapper[4171]: I0223 14:17:47.591367 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:47.591810 master-0 kubenswrapper[4171]: E0223 14:17:47.591522 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 14:17:47.591810 master-0 kubenswrapper[4171]: E0223 14:17:47.591537 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 14:17:47.591810 master-0 kubenswrapper[4171]: E0223 14:17:47.591548 4171 projected.go:194] Error preparing data for projected volume kube-api-access-x4lz2 for pod openshift-network-diagnostics/network-check-target-x9gxm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:47.591810 master-0 kubenswrapper[4171]: E0223 14:17:47.591595 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2 podName:ded555da-db03-498e-81a9-ad166f29a2aa nodeName:}" failed. No retries permitted until 2026-02-23 14:17:55.591582301 +0000 UTC m=+85.694983790 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x4lz2" (UniqueName: "kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2") pod "network-check-target-x9gxm" (UID: "ded555da-db03-498e-81a9-ad166f29a2aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:49.058676 master-0 kubenswrapper[4171]: I0223 14:17:49.058633 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:49.059124 master-0 kubenswrapper[4171]: E0223 14:17:49.058717 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:49.059124 master-0 kubenswrapper[4171]: I0223 14:17:49.058994 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:49.059124 master-0 kubenswrapper[4171]: E0223 14:17:49.059035 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:51.071093 master-0 kubenswrapper[4171]: I0223 14:17:51.071001 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:51.071734 master-0 kubenswrapper[4171]: I0223 14:17:51.071416 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:51.072810 master-0 kubenswrapper[4171]: E0223 14:17:51.072740 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:51.073023 master-0 kubenswrapper[4171]: E0223 14:17:51.072971 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:52.683966 master-0 kubenswrapper[4171]: I0223 14:17:52.683873 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Feb 23 14:17:52.683966 master-0 kubenswrapper[4171]: W0223 14:17:52.683944 4171 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Feb 23 14:17:53.057903 master-0 kubenswrapper[4171]: I0223 14:17:53.057735 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:53.057903 master-0 kubenswrapper[4171]: I0223 14:17:53.057742 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:53.058181 master-0 kubenswrapper[4171]: E0223 14:17:53.057938 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:53.058181 master-0 kubenswrapper[4171]: E0223 14:17:53.057990 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:55.057265 master-0 kubenswrapper[4171]: I0223 14:17:55.057205 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:55.057799 master-0 kubenswrapper[4171]: I0223 14:17:55.057271 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:55.057799 master-0 kubenswrapper[4171]: E0223 14:17:55.057361 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:55.057799 master-0 kubenswrapper[4171]: E0223 14:17:55.057436 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:55.663338 master-0 kubenswrapper[4171]: I0223 14:17:55.663254 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:55.663779 master-0 kubenswrapper[4171]: E0223 14:17:55.663546 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 14:17:55.663779 master-0 kubenswrapper[4171]: E0223 14:17:55.663590 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 23 14:17:55.663779 master-0 kubenswrapper[4171]: E0223 14:17:55.663612 4171 projected.go:194] Error preparing data for projected volume kube-api-access-x4lz2 for pod openshift-network-diagnostics/network-check-target-x9gxm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:55.663779 master-0 kubenswrapper[4171]: E0223 14:17:55.663693 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2 podName:ded555da-db03-498e-81a9-ad166f29a2aa nodeName:}" failed. No retries permitted until 2026-02-23 14:18:11.663670942 +0000 UTC m=+101.767072441 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x4lz2" (UniqueName: "kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2") pod "network-check-target-x9gxm" (UID: "ded555da-db03-498e-81a9-ad166f29a2aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 23 14:17:57.042675 master-0 kubenswrapper[4171]: I0223 14:17:57.042611 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:57.043225 master-0 kubenswrapper[4171]: E0223 14:17:57.042759 4171 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:57.043225 master-0 kubenswrapper[4171]: E0223 14:17:57.042826 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.042807575 +0000 UTC m=+119.146209064 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 14:17:57.057242 master-0 kubenswrapper[4171]: I0223 14:17:57.057195 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:57.057242 master-0 kubenswrapper[4171]: I0223 14:17:57.057217 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:17:57.057367 master-0 kubenswrapper[4171]: E0223 14:17:57.057316 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:17:57.057473 master-0 kubenswrapper[4171]: E0223 14:17:57.057441 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:17:58.068868 master-0 kubenswrapper[4171]: I0223 14:17:58.068824 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Feb 23 14:17:59.057240 master-0 kubenswrapper[4171]: I0223 14:17:59.057169 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:17:59.057457 master-0 kubenswrapper[4171]: I0223 14:17:59.057192 4171 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:17:59.057457 master-0 kubenswrapper[4171]: E0223 14:17:59.057359 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:17:59.057627 master-0 kubenswrapper[4171]: E0223 14:17:59.057571 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:17:59.607265 master-0 kubenswrapper[4171]: I0223 14:17:59.607152 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerStarted","Data":"be11245e52df36836387b793176a5296c3112993cdce052d05331b901d833321"} Feb 23 14:17:59.612820 master-0 kubenswrapper[4171]: I0223 14:17:59.612752 4171 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="a6fb92a24f40b4f0a4db9442684eefd34b35d2511917f6d03fe2ac8345b66ead" exitCode=0 Feb 23 14:17:59.612820 master-0 kubenswrapper[4171]: I0223 14:17:59.612809 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"a6fb92a24f40b4f0a4db9442684eefd34b35d2511917f6d03fe2ac8345b66ead"} Feb 23 
14:17:59.616019 master-0 kubenswrapper[4171]: I0223 14:17:59.615954 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af" exitCode=0 Feb 23 14:17:59.616167 master-0 kubenswrapper[4171]: I0223 14:17:59.616074 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} Feb 23 14:17:59.619284 master-0 kubenswrapper[4171]: I0223 14:17:59.619244 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerStarted","Data":"86a800fe59aed9a0c248de7a352a6c1ffaea2cbdde27bb246147baa866e1c79a"} Feb 23 14:17:59.619284 master-0 kubenswrapper[4171]: I0223 14:17:59.619278 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerStarted","Data":"88c122508c98e7a0f40824b07fef074fffccd316aa8ae95f930b03c4abba7eb2"} Feb 23 14:17:59.625662 master-0 kubenswrapper[4171]: I0223 14:17:59.625594 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=8.625576841 podStartE2EDuration="8.625576841s" podCreationTimestamp="2026-02-23 14:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:17:59.624557192 +0000 UTC m=+89.727958721" watchObservedRunningTime="2026-02-23 14:17:59.625576841 +0000 UTC m=+89.728978340" Feb 23 14:17:59.656231 master-0 kubenswrapper[4171]: I0223 14:17:59.656129 4171 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" podStartSLOduration=4.949559523 podStartE2EDuration="23.656097031s" podCreationTimestamp="2026-02-23 14:17:36 +0000 UTC" firstStartedPulling="2026-02-23 14:17:39.801087083 +0000 UTC m=+69.904488582" lastFinishedPulling="2026-02-23 14:17:58.507624561 +0000 UTC m=+88.611026090" observedRunningTime="2026-02-23 14:17:59.639521692 +0000 UTC m=+89.742923211" watchObservedRunningTime="2026-02-23 14:17:59.656097031 +0000 UTC m=+89.759498560" Feb 23 14:17:59.657325 master-0 kubenswrapper[4171]: I0223 14:17:59.657254 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=1.657243143 podStartE2EDuration="1.657243143s" podCreationTimestamp="2026-02-23 14:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:17:59.656848415 +0000 UTC m=+89.760249914" watchObservedRunningTime="2026-02-23 14:17:59.657243143 +0000 UTC m=+89.760644672" Feb 23 14:17:59.709528 master-0 kubenswrapper[4171]: I0223 14:17:59.709417 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-td489" podStartSLOduration=1.7948699769999998 podStartE2EDuration="17.709398318s" podCreationTimestamp="2026-02-23 14:17:42 +0000 UTC" firstStartedPulling="2026-02-23 14:17:42.593026219 +0000 UTC m=+72.696427708" lastFinishedPulling="2026-02-23 14:17:58.50755451 +0000 UTC m=+88.610956049" observedRunningTime="2026-02-23 14:17:59.709112513 +0000 UTC m=+89.812514002" watchObservedRunningTime="2026-02-23 14:17:59.709398318 +0000 UTC m=+89.812799807" Feb 23 14:18:00.627117 master-0 kubenswrapper[4171]: I0223 14:18:00.626857 4171 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" 
containerID="6c7ee6bebf88d829805371dc4fd4b58845a3f175897eb6486d1688a8a41b95ec" exitCode=0 Feb 23 14:18:00.627117 master-0 kubenswrapper[4171]: I0223 14:18:00.626963 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"6c7ee6bebf88d829805371dc4fd4b58845a3f175897eb6486d1688a8a41b95ec"} Feb 23 14:18:00.630341 master-0 kubenswrapper[4171]: I0223 14:18:00.629370 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} Feb 23 14:18:00.630341 master-0 kubenswrapper[4171]: I0223 14:18:00.629425 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} Feb 23 14:18:00.630341 master-0 kubenswrapper[4171]: I0223 14:18:00.629449 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} Feb 23 14:18:00.630341 master-0 kubenswrapper[4171]: I0223 14:18:00.629468 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} Feb 23 14:18:00.630341 master-0 kubenswrapper[4171]: I0223 14:18:00.629510 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" 
event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} Feb 23 14:18:00.630341 master-0 kubenswrapper[4171]: I0223 14:18:00.629527 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} Feb 23 14:18:01.057311 master-0 kubenswrapper[4171]: I0223 14:18:01.057253 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:01.058853 master-0 kubenswrapper[4171]: E0223 14:18:01.058808 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:01.059063 master-0 kubenswrapper[4171]: I0223 14:18:01.058981 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:01.059322 master-0 kubenswrapper[4171]: E0223 14:18:01.059250 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:01.639932 master-0 kubenswrapper[4171]: I0223 14:18:01.639878 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerStarted","Data":"7389ea8bfb9da14a68c3071f0b69781f7e61f7f464c04916bdf2668709eba104"} Feb 23 14:18:01.667712 master-0 kubenswrapper[4171]: I0223 14:18:01.667595 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" podStartSLOduration=3.906326085 podStartE2EDuration="37.667562607s" podCreationTimestamp="2026-02-23 14:17:24 +0000 UTC" firstStartedPulling="2026-02-23 14:17:24.601395729 +0000 UTC m=+54.704797228" lastFinishedPulling="2026-02-23 14:17:58.362632241 +0000 UTC m=+88.466033750" observedRunningTime="2026-02-23 14:18:01.666299563 +0000 UTC m=+91.769701092" watchObservedRunningTime="2026-02-23 14:18:01.667562607 +0000 UTC m=+91.770964156" Feb 23 14:18:02.650986 master-0 kubenswrapper[4171]: I0223 14:18:02.650788 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} Feb 23 14:18:03.057168 master-0 kubenswrapper[4171]: I0223 14:18:03.057093 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:03.057434 master-0 kubenswrapper[4171]: I0223 14:18:03.057094 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:03.057434 master-0 kubenswrapper[4171]: E0223 14:18:03.057284 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:03.057434 master-0 kubenswrapper[4171]: E0223 14:18:03.057224 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:04.397626 master-0 kubenswrapper[4171]: I0223 14:18:04.397543 4171 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qtjsn"] Feb 23 14:18:05.058243 master-0 kubenswrapper[4171]: I0223 14:18:05.057814 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:05.058513 master-0 kubenswrapper[4171]: E0223 14:18:05.058409 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:05.058826 master-0 kubenswrapper[4171]: I0223 14:18:05.057814 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:05.058946 master-0 kubenswrapper[4171]: E0223 14:18:05.058886 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:05.665003 master-0 kubenswrapper[4171]: I0223 14:18:05.664922 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerStarted","Data":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665188 4171 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="northd" containerID="cri-o://6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd" gracePeriod=30 Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665188 4171 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-controller" containerID="cri-o://c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125" gracePeriod=30 Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665292 4171 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="sbdb" containerID="cri-o://386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" gracePeriod=30 Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665325 4171 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423" gracePeriod=30 Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665385 4171 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-node" containerID="cri-o://cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f" gracePeriod=30 Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665403 4171 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-acl-logging" containerID="cri-o://775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769" gracePeriod=30 Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665393 4171 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="nbdb" containerID="cri-o://11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633" gracePeriod=30 Feb 23 14:18:05.665899 master-0 kubenswrapper[4171]: I0223 14:18:05.665465 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:18:05.669183 
master-0 kubenswrapper[4171]: E0223 14:18:05.669100 4171 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 23 14:18:05.670746 master-0 kubenswrapper[4171]: E0223 14:18:05.670690 4171 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 23 14:18:05.673949 master-0 kubenswrapper[4171]: E0223 14:18:05.673284 4171 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Feb 23 14:18:05.673949 master-0 kubenswrapper[4171]: E0223 14:18:05.673326 4171 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="sbdb" Feb 23 14:18:05.683929 master-0 kubenswrapper[4171]: I0223 14:18:05.683417 4171 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovnkube-controller" containerID="cri-o://e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f" gracePeriod=30 Feb 23 14:18:06.264864 master-0 kubenswrapper[4171]: I0223 14:18:06.264762 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/ovnkube-controller/0.log" Feb 23 14:18:06.266958 master-0 kubenswrapper[4171]: I0223 14:18:06.266915 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/kube-rbac-proxy-ovn-metrics/0.log" Feb 23 14:18:06.267782 master-0 kubenswrapper[4171]: I0223 14:18:06.267734 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/kube-rbac-proxy-node/0.log" Feb 23 14:18:06.268688 master-0 kubenswrapper[4171]: I0223 14:18:06.268651 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/ovn-acl-logging/0.log" Feb 23 14:18:06.269542 master-0 kubenswrapper[4171]: I0223 14:18:06.269515 4171 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/ovn-controller/0.log" Feb 23 14:18:06.270108 master-0 kubenswrapper[4171]: I0223 14:18:06.270084 4171 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326413 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-kubelet\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326529 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-ovn-kubernetes\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326561 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-netns\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326555 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326607 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02998803-dd12-4c68-990e-efd131076a0f-ovn-node-metrics-cert\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326637 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326647 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddbg8\" (UniqueName: \"kubernetes.io/projected/02998803-dd12-4c68-990e-efd131076a0f-kube-api-access-ddbg8\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326670 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:18:06.326654 master-0 kubenswrapper[4171]: I0223 14:18:06.326680 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-env-overrides\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326736 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-script-lib\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326768 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-etc-openvswitch\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326795 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-slash\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326823 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-log-socket\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") " Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326859 4171 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-config\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326887 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-node-log\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326913 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-systemd\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326947 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.326981 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-bin\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327007 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-var-lib-openvswitch\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327045 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-systemd-units\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327115 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-ovn\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327145 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-netd\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327173 4171 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-openvswitch\") pod \"02998803-dd12-4c68-990e-efd131076a0f\" (UID: \"02998803-dd12-4c68-990e-efd131076a0f\") "
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327320 4171 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327342 4171 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-run-netns\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327357 4171 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-kubelet\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.328285 master-0 kubenswrapper[4171]: I0223 14:18:06.327456 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.328847 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-node-log" (OuterVolumeSpecName: "node-log") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.328889 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.328907 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-slash" (OuterVolumeSpecName: "host-slash") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.328943 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-log-socket" (OuterVolumeSpecName: "log-socket") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329355 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329362 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ftngv"]
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329536 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="nbdb"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329558 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="nbdb"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329571 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovnkube-controller"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329582 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovnkube-controller"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329594 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329604 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329617 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-controller"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329629 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-controller"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329641 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-acl-logging"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329654 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-acl-logging"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329664 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-node"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329676 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-node"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329688 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="northd"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329698 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="northd"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329709 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="sbdb"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329719 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="sbdb"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: E0223 14:18:06.329731 4171 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kubecfg-setup"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329741 4171 state_mem.go:107] "Deleted CPUSet assignment" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kubecfg-setup"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329800 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="sbdb"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329818 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="nbdb"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329828 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovnkube-controller"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329839 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-node"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329851 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="kube-rbac-proxy-ovn-metrics"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.329862 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-acl-logging"
Feb 23 14:18:06.332830 master-0 kubenswrapper[4171]: I0223 14:18:06.330066 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="ovn-controller"
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.330067 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.330078 4171 memory_manager.go:354] "RemoveStaleState removing state" podUID="02998803-dd12-4c68-990e-efd131076a0f" containerName="northd"
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.330113 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.330814 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.331125 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.331170 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.331376 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.331375 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.331714 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.331196 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.332808 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02998803-dd12-4c68-990e-efd131076a0f-kube-api-access-ddbg8" (OuterVolumeSpecName: "kube-api-access-ddbg8") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "kube-api-access-ddbg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:18:06.335087 master-0 kubenswrapper[4171]: I0223 14:18:06.333557 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02998803-dd12-4c68-990e-efd131076a0f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:18:06.341252 master-0 kubenswrapper[4171]: I0223 14:18:06.341179 4171 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "02998803-dd12-4c68-990e-efd131076a0f" (UID: "02998803-dd12-4c68-990e-efd131076a0f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:18:06.428075 master-0 kubenswrapper[4171]: I0223 14:18:06.427939 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428075 master-0 kubenswrapper[4171]: I0223 14:18:06.428002 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428075 master-0 kubenswrapper[4171]: I0223 14:18:06.428024 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428075 master-0 kubenswrapper[4171]: I0223 14:18:06.428047 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428075 master-0 kubenswrapper[4171]: I0223 14:18:06.428071 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428094 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428134 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428155 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428180 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428205 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428245 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428267 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428285 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428306 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428352 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-269v7\" (UniqueName: \"kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428406 master-0 kubenswrapper[4171]: I0223 14:18:06.428411 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428452 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428493 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428527 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428555 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428587 4171 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428606 4171 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428622 4171 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-systemd-units\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428635 4171 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-ovn\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428648 4171 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428661 4171 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428674 4171 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/02998803-dd12-4c68-990e-efd131076a0f-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428687 4171 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddbg8\" (UniqueName: \"kubernetes.io/projected/02998803-dd12-4c68-990e-efd131076a0f-kube-api-access-ddbg8\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428699 4171 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-env-overrides\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428711 4171 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428723 4171 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428735 4171 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-slash\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428747 4171 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-log-socket\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428759 4171 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/02998803-dd12-4c68-990e-efd131076a0f-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428771 4171 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-node-log\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428782 4171 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-run-systemd\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.428807 master-0 kubenswrapper[4171]: I0223 14:18:06.428795 4171 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/02998803-dd12-4c68-990e-efd131076a0f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Feb 23 14:18:06.529621 master-0 kubenswrapper[4171]: I0223 14:18:06.529544 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529621 master-0 kubenswrapper[4171]: I0223 14:18:06.529632 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529866 master-0 kubenswrapper[4171]: I0223 14:18:06.529656 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529866 master-0 kubenswrapper[4171]: I0223 14:18:06.529679 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529866 master-0 kubenswrapper[4171]: I0223 14:18:06.529749 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529974 master-0 kubenswrapper[4171]: I0223 14:18:06.529891 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529974 master-0 kubenswrapper[4171]: I0223 14:18:06.529885 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529974 master-0 kubenswrapper[4171]: I0223 14:18:06.529932 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-269v7\" (UniqueName: \"kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.529974 master-0 kubenswrapper[4171]: I0223 14:18:06.529964 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530118 master-0 kubenswrapper[4171]: I0223 14:18:06.530067 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530254 master-0 kubenswrapper[4171]: I0223 14:18:06.530195 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530341 master-0 kubenswrapper[4171]: I0223 14:18:06.530267 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530454 master-0 kubenswrapper[4171]: I0223 14:18:06.530364 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530454 master-0 kubenswrapper[4171]: I0223 14:18:06.530397 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530697 master-0 kubenswrapper[4171]: I0223 14:18:06.530457 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530697 master-0 kubenswrapper[4171]: I0223 14:18:06.530548 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530697 master-0 kubenswrapper[4171]: I0223 14:18:06.530583 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530697 master-0 kubenswrapper[4171]: I0223 14:18:06.530553 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530697 master-0 kubenswrapper[4171]: I0223 14:18:06.530584 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530697 master-0 kubenswrapper[4171]: I0223 14:18:06.530530 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.530697 master-0 kubenswrapper[4171]: I0223 14:18:06.530653 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530793 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530825 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530825 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530856 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530887 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530893 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530910 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530939 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530987 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.531005 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.530996 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.531026 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.531074 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.531105 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531208 master-0 kubenswrapper[4171]: I0223 14:18:06.531166 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.531775 master-0 kubenswrapper[4171]: I0223 14:18:06.531625 4171 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.532007 master-0 kubenswrapper[4171]: I0223 14:18:06.531925 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.536127 master-0 kubenswrapper[4171]: I0223 14:18:06.536059 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.559684 master-0 kubenswrapper[4171]: I0223 14:18:06.559595 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-269v7\" (UniqueName: \"kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.643675 master-0 kubenswrapper[4171]: I0223 14:18:06.643590 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:06.658248 master-0 kubenswrapper[4171]: W0223 14:18:06.658170 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf10f592e_5738_4879_b776_246b357d4621.slice/crio-984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274 WatchSource:0}: Error finding container 984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274: Status 404 returned error can't find the container with id 984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274 Feb 23 14:18:06.671779 master-0 kubenswrapper[4171]: I0223 14:18:06.671719 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/ovnkube-controller/0.log" Feb 23 14:18:06.674336 master-0 kubenswrapper[4171]: I0223 14:18:06.674306 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/kube-rbac-proxy-ovn-metrics/0.log" Feb 23 14:18:06.675164 master-0 kubenswrapper[4171]: I0223 14:18:06.675137 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/kube-rbac-proxy-node/0.log" Feb 23 14:18:06.675967 master-0 kubenswrapper[4171]: I0223 14:18:06.675814 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/ovn-acl-logging/0.log" Feb 23 14:18:06.676354 master-0 kubenswrapper[4171]: I0223 14:18:06.676316 4171 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtjsn_02998803-dd12-4c68-990e-efd131076a0f/ovn-controller/0.log" Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676851 4171 generic.go:334] "Generic (PLEG): container finished" 
podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f" exitCode=1 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676894 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" exitCode=0 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676913 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633" exitCode=0 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676929 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd" exitCode=0 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676943 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423" exitCode=143 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676957 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f" exitCode=143 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676971 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" containerID="775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769" exitCode=143 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676985 4171 generic.go:334] "Generic (PLEG): container finished" podID="02998803-dd12-4c68-990e-efd131076a0f" 
containerID="c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125" exitCode=143 Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.677046 4171 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" Feb 23 14:18:06.677124 master-0 kubenswrapper[4171]: I0223 14:18:06.676978 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677194 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677215 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677234 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677250 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: 
I0223 14:18:06.677264 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677291 4171 scope.go:117] "RemoveContainer" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f" Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677280 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677397 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677406 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677416 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677428 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677439 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677448 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677455 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677463 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677470 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677500 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677508 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677515 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 
14:18:06.677527 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677538 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677547 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677555 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677563 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677571 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677578 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} Feb 23 14:18:06.679029 master-0 kubenswrapper[4171]: I0223 14:18:06.677585 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677593 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677602 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677611 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtjsn" event={"ID":"02998803-dd12-4c68-990e-efd131076a0f","Type":"ContainerDied","Data":"477770141bf255a57c3800674a1973731f4824ba7eeefbf57c50b1692c3eba69"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677623 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677636 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677644 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677653 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} Feb 
23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677661 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677668 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677676 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677683 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} Feb 23 14:18:06.680047 master-0 kubenswrapper[4171]: I0223 14:18:06.677690 4171 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} Feb 23 14:18:06.680959 master-0 kubenswrapper[4171]: I0223 14:18:06.680883 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274"} Feb 23 14:18:06.694370 master-0 kubenswrapper[4171]: I0223 14:18:06.694333 4171 scope.go:117] "RemoveContainer" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" Feb 23 14:18:06.709367 master-0 kubenswrapper[4171]: I0223 14:18:06.709332 4171 scope.go:117] "RemoveContainer" 
containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633" Feb 23 14:18:06.725380 master-0 kubenswrapper[4171]: I0223 14:18:06.725324 4171 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qtjsn"] Feb 23 14:18:06.731845 master-0 kubenswrapper[4171]: I0223 14:18:06.731791 4171 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qtjsn"] Feb 23 14:18:06.740586 master-0 kubenswrapper[4171]: I0223 14:18:06.740543 4171 scope.go:117] "RemoveContainer" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd" Feb 23 14:18:06.751887 master-0 kubenswrapper[4171]: I0223 14:18:06.751676 4171 scope.go:117] "RemoveContainer" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423" Feb 23 14:18:06.764500 master-0 kubenswrapper[4171]: I0223 14:18:06.764402 4171 scope.go:117] "RemoveContainer" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f" Feb 23 14:18:06.873157 master-0 kubenswrapper[4171]: I0223 14:18:06.873101 4171 scope.go:117] "RemoveContainer" containerID="775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769" Feb 23 14:18:06.886316 master-0 kubenswrapper[4171]: I0223 14:18:06.886279 4171 scope.go:117] "RemoveContainer" containerID="c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125" Feb 23 14:18:06.899366 master-0 kubenswrapper[4171]: I0223 14:18:06.899309 4171 scope.go:117] "RemoveContainer" containerID="ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af" Feb 23 14:18:06.915059 master-0 kubenswrapper[4171]: I0223 14:18:06.914998 4171 scope.go:117] "RemoveContainer" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f" Feb 23 14:18:06.915671 master-0 kubenswrapper[4171]: E0223 14:18:06.915605 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": container with ID starting with e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f not found: ID does not exist" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f" Feb 23 14:18:06.915768 master-0 kubenswrapper[4171]: I0223 14:18:06.915676 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} err="failed to get container status \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": rpc error: code = NotFound desc = could not find container \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": container with ID starting with e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f not found: ID does not exist" Feb 23 14:18:06.915768 master-0 kubenswrapper[4171]: I0223 14:18:06.915725 4171 scope.go:117] "RemoveContainer" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" Feb 23 14:18:06.916273 master-0 kubenswrapper[4171]: E0223 14:18:06.916217 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": container with ID starting with 386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478 not found: ID does not exist" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478" Feb 23 14:18:06.916348 master-0 kubenswrapper[4171]: I0223 14:18:06.916273 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} err="failed to get container status \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": rpc error: code = NotFound desc = could not find container 
\"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": container with ID starting with 386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478 not found: ID does not exist"
Feb 23 14:18:06.916348 master-0 kubenswrapper[4171]: I0223 14:18:06.916308 4171 scope.go:117] "RemoveContainer" containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"
Feb 23 14:18:06.917078 master-0 kubenswrapper[4171]: E0223 14:18:06.917012 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": container with ID starting with 11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633 not found: ID does not exist" containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"
Feb 23 14:18:06.917149 master-0 kubenswrapper[4171]: I0223 14:18:06.917073 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} err="failed to get container status \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": rpc error: code = NotFound desc = could not find container \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": container with ID starting with 11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633 not found: ID does not exist"
Feb 23 14:18:06.917149 master-0 kubenswrapper[4171]: I0223 14:18:06.917103 4171 scope.go:117] "RemoveContainer" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"
Feb 23 14:18:06.917605 master-0 kubenswrapper[4171]: E0223 14:18:06.917554 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": container with ID starting with 6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd not found: ID does not exist" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"
Feb 23 14:18:06.917690 master-0 kubenswrapper[4171]: I0223 14:18:06.917595 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} err="failed to get container status \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": rpc error: code = NotFound desc = could not find container \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": container with ID starting with 6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd not found: ID does not exist"
Feb 23 14:18:06.917690 master-0 kubenswrapper[4171]: I0223 14:18:06.917635 4171 scope.go:117] "RemoveContainer" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"
Feb 23 14:18:06.918271 master-0 kubenswrapper[4171]: E0223 14:18:06.918207 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": container with ID starting with 9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423 not found: ID does not exist" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"
Feb 23 14:18:06.918350 master-0 kubenswrapper[4171]: I0223 14:18:06.918268 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} err="failed to get container status \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": rpc error: code = NotFound desc = could not find container \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": container with ID starting with 9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423 not found: ID does not exist"
Feb 23 14:18:06.918350 master-0 kubenswrapper[4171]: I0223 14:18:06.918303 4171 scope.go:117] "RemoveContainer" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"
Feb 23 14:18:06.918736 master-0 kubenswrapper[4171]: E0223 14:18:06.918689 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": container with ID starting with cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f not found: ID does not exist" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"
Feb 23 14:18:06.918819 master-0 kubenswrapper[4171]: I0223 14:18:06.918727 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} err="failed to get container status \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": rpc error: code = NotFound desc = could not find container \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": container with ID starting with cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f not found: ID does not exist"
Feb 23 14:18:06.918819 master-0 kubenswrapper[4171]: I0223 14:18:06.918750 4171 scope.go:117] "RemoveContainer" containerID="775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"
Feb 23 14:18:06.919232 master-0 kubenswrapper[4171]: E0223 14:18:06.919163 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": container with ID starting with 775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769 not found: ID does not exist" containerID="775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"
Feb 23 14:18:06.919304 master-0 kubenswrapper[4171]: I0223 14:18:06.919235 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} err="failed to get container status \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": rpc error: code = NotFound desc = could not find container \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": container with ID starting with 775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769 not found: ID does not exist"
Feb 23 14:18:06.919304 master-0 kubenswrapper[4171]: I0223 14:18:06.919276 4171 scope.go:117] "RemoveContainer" containerID="c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"
Feb 23 14:18:06.920119 master-0 kubenswrapper[4171]: E0223 14:18:06.920054 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": container with ID starting with c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125 not found: ID does not exist" containerID="c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"
Feb 23 14:18:06.920190 master-0 kubenswrapper[4171]: I0223 14:18:06.920111 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} err="failed to get container status \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": rpc error: code = NotFound desc = could not find container \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": container with ID starting with c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125 not found: ID does not exist"
Feb 23 14:18:06.920190 master-0 kubenswrapper[4171]: I0223 14:18:06.920144 4171 scope.go:117] "RemoveContainer" containerID="ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"
Feb 23 14:18:06.920576 master-0 kubenswrapper[4171]: E0223 14:18:06.920527 4171 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": container with ID starting with ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af not found: ID does not exist" containerID="ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"
Feb 23 14:18:06.920652 master-0 kubenswrapper[4171]: I0223 14:18:06.920572 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} err="failed to get container status \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": rpc error: code = NotFound desc = could not find container \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": container with ID starting with ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af not found: ID does not exist"
Feb 23 14:18:06.920652 master-0 kubenswrapper[4171]: I0223 14:18:06.920599 4171 scope.go:117] "RemoveContainer" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"
Feb 23 14:18:06.921205 master-0 kubenswrapper[4171]: I0223 14:18:06.921132 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} err="failed to get container status \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": rpc error: code = NotFound desc = could not find container \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": container with ID starting with e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f not found: ID does not exist"
Feb 23 14:18:06.921205 master-0 kubenswrapper[4171]: I0223 14:18:06.921191 4171 scope.go:117] "RemoveContainer" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"
Feb 23 14:18:06.921910 master-0 kubenswrapper[4171]: I0223 14:18:06.921828 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} err="failed to get container status \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": rpc error: code = NotFound desc = could not find container \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": container with ID starting with 386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478 not found: ID does not exist"
Feb 23 14:18:06.921987 master-0 kubenswrapper[4171]: I0223 14:18:06.921909 4171 scope.go:117] "RemoveContainer" containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"
Feb 23 14:18:06.922603 master-0 kubenswrapper[4171]: I0223 14:18:06.922429 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} err="failed to get container status \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": rpc error: code = NotFound desc = could not find container \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": container with ID starting with 11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633 not found: ID does not exist"
Feb 23 14:18:06.922693 master-0 kubenswrapper[4171]: I0223 14:18:06.922618 4171 scope.go:117] "RemoveContainer" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"
Feb 23 14:18:06.923424 master-0 kubenswrapper[4171]: I0223 14:18:06.923355 4171 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} err="failed to get container status \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": rpc error: code = NotFound desc = could not find container \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": container with ID starting with 6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd not found: ID does not exist"
Feb 23 14:18:06.923424 master-0 kubenswrapper[4171]: I0223 14:18:06.923412 4171 scope.go:117] "RemoveContainer" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"
Feb 23 14:18:06.923883 master-0 kubenswrapper[4171]: I0223 14:18:06.923813 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} err="failed to get container status \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": rpc error: code = NotFound desc = could not find container \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": container with ID starting with 9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423 not found: ID does not exist"
Feb 23 14:18:06.923883 master-0 kubenswrapper[4171]: I0223 14:18:06.923870 4171 scope.go:117] "RemoveContainer" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"
Feb 23 14:18:06.924597 master-0 kubenswrapper[4171]: I0223 14:18:06.924544 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} err="failed to get container status \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": rpc error: code = NotFound desc = could not find container \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": container with ID starting with cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f not found: ID does not exist"
Feb 23 14:18:06.924597 master-0 kubenswrapper[4171]: I0223 14:18:06.924589 4171 scope.go:117] "RemoveContainer" containerID="775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"
Feb 23 14:18:06.925208 master-0 kubenswrapper[4171]: I0223 14:18:06.925099 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} err="failed to get container status \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": rpc error: code = NotFound desc = could not find container \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": container with ID starting with 775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769 not found: ID does not exist"
Feb 23 14:18:06.925291 master-0 kubenswrapper[4171]: I0223 14:18:06.925196 4171 scope.go:117] "RemoveContainer" containerID="c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"
Feb 23 14:18:06.925863 master-0 kubenswrapper[4171]: I0223 14:18:06.925810 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} err="failed to get container status \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": rpc error: code = NotFound desc = could not find container \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": container with ID starting with c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125 not found: ID does not exist"
Feb 23 14:18:06.925863 master-0 kubenswrapper[4171]: I0223 14:18:06.925853 4171 scope.go:117] "RemoveContainer" containerID="ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"
Feb 23 14:18:06.926397 master-0 kubenswrapper[4171]: I0223 14:18:06.926328 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} err="failed to get container status \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": rpc error: code = NotFound desc = could not find container \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": container with ID starting with ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af not found: ID does not exist"
Feb 23 14:18:06.926501 master-0 kubenswrapper[4171]: I0223 14:18:06.926419 4171 scope.go:117] "RemoveContainer" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"
Feb 23 14:18:06.926955 master-0 kubenswrapper[4171]: I0223 14:18:06.926883 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} err="failed to get container status \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": rpc error: code = NotFound desc = could not find container \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": container with ID starting with e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f not found: ID does not exist"
Feb 23 14:18:06.926955 master-0 kubenswrapper[4171]: I0223 14:18:06.926943 4171 scope.go:117] "RemoveContainer" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"
Feb 23 14:18:06.927363 master-0 kubenswrapper[4171]: I0223 14:18:06.927307 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} err="failed to get container status \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": rpc error: code = NotFound desc = could not find container \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": container with ID starting with 386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478 not found: ID does not exist"
Feb 23 14:18:06.927363 master-0 kubenswrapper[4171]: I0223 14:18:06.927344 4171 scope.go:117] "RemoveContainer" containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"
Feb 23 14:18:06.927800 master-0 kubenswrapper[4171]: I0223 14:18:06.927728 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} err="failed to get container status \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": rpc error: code = NotFound desc = could not find container \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": container with ID starting with 11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633 not found: ID does not exist"
Feb 23 14:18:06.927800 master-0 kubenswrapper[4171]: I0223 14:18:06.927785 4171 scope.go:117] "RemoveContainer" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"
Feb 23 14:18:06.928209 master-0 kubenswrapper[4171]: I0223 14:18:06.928148 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} err="failed to get container status \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": rpc error: code = NotFound desc = could not find container \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": container with ID starting with 6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd not found: ID does not exist"
Feb 23 14:18:06.928209 master-0 kubenswrapper[4171]: I0223 14:18:06.928198 4171 scope.go:117] "RemoveContainer" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"
Feb 23 14:18:06.928779 master-0 kubenswrapper[4171]: I0223 14:18:06.928711 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} err="failed to get container status \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": rpc error: code = NotFound desc = could not find container \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": container with ID starting with 9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423 not found: ID does not exist"
Feb 23 14:18:06.928869 master-0 kubenswrapper[4171]: I0223 14:18:06.928808 4171 scope.go:117] "RemoveContainer" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"
Feb 23 14:18:06.929338 master-0 kubenswrapper[4171]: I0223 14:18:06.929257 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} err="failed to get container status \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": rpc error: code = NotFound desc = could not find container \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": container with ID starting with cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f not found: ID does not exist"
Feb 23 14:18:06.929338 master-0 kubenswrapper[4171]: I0223 14:18:06.929292 4171 scope.go:117] "RemoveContainer" containerID="775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"
Feb 23 14:18:06.929881 master-0 kubenswrapper[4171]: I0223 14:18:06.929804 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} err="failed to get container status \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": rpc error: code = NotFound desc = could not find container \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": container with ID starting with 775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769 not found: ID does not exist"
Feb 23 14:18:06.929881 master-0 kubenswrapper[4171]: I0223 14:18:06.929871 4171 scope.go:117] "RemoveContainer" containerID="c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"
Feb 23 14:18:06.930540 master-0 kubenswrapper[4171]: I0223 14:18:06.930468 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} err="failed to get container status \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": rpc error: code = NotFound desc = could not find container \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": container with ID starting with c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125 not found: ID does not exist"
Feb 23 14:18:06.930540 master-0 kubenswrapper[4171]: I0223 14:18:06.930530 4171 scope.go:117] "RemoveContainer" containerID="ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"
Feb 23 14:18:06.931099 master-0 kubenswrapper[4171]: I0223 14:18:06.930991 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} err="failed to get container status \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": rpc error: code = NotFound desc = could not find container \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": container with ID starting with ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af not found: ID does not exist"
Feb 23 14:18:06.931099 master-0 kubenswrapper[4171]: I0223 14:18:06.931085 4171 scope.go:117] "RemoveContainer" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"
Feb 23 14:18:06.931637 master-0 kubenswrapper[4171]: I0223 14:18:06.931571 4171 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} err="failed to get container status \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": rpc error: code = NotFound desc = could not find container \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": container with ID starting with e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f not found: ID does not exist"
Feb 23 14:18:06.931637 master-0 kubenswrapper[4171]: I0223 14:18:06.931629 4171 scope.go:117] "RemoveContainer" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"
Feb 23 14:18:06.932087 master-0 kubenswrapper[4171]: I0223 14:18:06.932032 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} err="failed to get container status \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": rpc error: code = NotFound desc = could not find container \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": container with ID starting with 386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478 not found: ID does not exist"
Feb 23 14:18:06.932087 master-0 kubenswrapper[4171]: I0223 14:18:06.932071 4171 scope.go:117] "RemoveContainer" containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"
Feb 23 14:18:06.932515 master-0 kubenswrapper[4171]: I0223 14:18:06.932428 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} err="failed to get container status \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": rpc error: code = NotFound desc = could not find container \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": container with ID starting with 11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633 not found: ID does not exist"
Feb 23 14:18:06.932596 master-0 kubenswrapper[4171]: I0223 14:18:06.932519 4171 scope.go:117] "RemoveContainer" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"
Feb 23 14:18:06.933266 master-0 kubenswrapper[4171]: I0223 14:18:06.933206 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} err="failed to get container status \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": rpc error: code = NotFound desc = could not find container \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": container with ID starting with 6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd not found: ID does not exist"
Feb 23 14:18:06.933266 master-0 kubenswrapper[4171]: I0223 14:18:06.933243 4171 scope.go:117] "RemoveContainer" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"
Feb 23 14:18:06.933727 master-0 kubenswrapper[4171]: I0223 14:18:06.933677 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} err="failed to get container status \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": rpc error: code = NotFound desc = could not find container \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": container with ID starting with 9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423 not found: ID does not exist"
Feb 23 14:18:06.933727 master-0 kubenswrapper[4171]: I0223 14:18:06.933710 4171 scope.go:117] "RemoveContainer" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"
Feb 23 14:18:06.934405 master-0 kubenswrapper[4171]: I0223 14:18:06.934345 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} err="failed to get container status \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": rpc error: code = NotFound desc = could not find container \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": container with ID starting with cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f not found: ID does not exist"
Feb 23 14:18:06.934460 master-0 kubenswrapper[4171]: I0223 14:18:06.934399 4171 scope.go:117] "RemoveContainer" containerID="775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"
Feb 23 14:18:06.934872 master-0 kubenswrapper[4171]: I0223 14:18:06.934821 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769"} err="failed to get container status \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": rpc error: code = NotFound desc = could not find container \"775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769\": container with ID starting with 775caf7c4cc79ea4a46d874a7a2e150a8b62ada28f4738e387f4676813f18769 not found: ID does not exist"
Feb 23 14:18:06.934872 master-0 kubenswrapper[4171]: I0223 14:18:06.934861 4171 scope.go:117] "RemoveContainer" containerID="c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"
Feb 23 14:18:06.935594 master-0 kubenswrapper[4171]: I0223 14:18:06.935552 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125"} err="failed to get container status \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": rpc error: code = NotFound desc = could not find container \"c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125\": container with ID starting with c21949a45a35d4423855abb386586b266abead9f08605c4e6ac84622a4f6f125 not found: ID does not exist"
Feb 23 14:18:06.935594 master-0 kubenswrapper[4171]: I0223 14:18:06.935585 4171 scope.go:117] "RemoveContainer" containerID="ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"
Feb 23 14:18:06.936091 master-0 kubenswrapper[4171]: I0223 14:18:06.936030 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af"} err="failed to get container status \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": rpc error: code = NotFound desc = could not find container \"ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af\": container with ID starting with ecedb62b37f77a1caab8aadc53f473fad0809a51787de9f83f8cf88aa03a01af not found: ID does not exist"
Feb 23 14:18:06.936091 master-0 kubenswrapper[4171]: I0223 14:18:06.936084 4171 scope.go:117] "RemoveContainer" containerID="e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"
Feb 23 14:18:06.936631 master-0 kubenswrapper[4171]: I0223 14:18:06.936578 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f"} err="failed to get container status \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": rpc error: code = NotFound desc = could not find container \"e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f\": container with ID starting with e877cac985fdd7e338dce9a85f68b7a4187df935fb253adca42d3b29386a970f not found: ID does not exist"
Feb 23 14:18:06.936631 master-0 kubenswrapper[4171]: I0223 14:18:06.936618 4171 scope.go:117] "RemoveContainer" containerID="386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"
Feb 23 14:18:06.936989 master-0 kubenswrapper[4171]: I0223 14:18:06.936943 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478"} err="failed to get container status \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": rpc error: code = NotFound desc = could not find container \"386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478\": container with ID starting with 386f832aae1e231a81641d6065a2cf8b0899003a4d13fc1ff6a9bab8c443d478 not found: ID does not exist"
Feb 23 14:18:06.936989 master-0 kubenswrapper[4171]: I0223 14:18:06.936978 4171 scope.go:117] "RemoveContainer" containerID="11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"
Feb 23 14:18:06.937305 master-0 kubenswrapper[4171]: I0223 14:18:06.937256 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633"} err="failed to get container status \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": rpc error: code = NotFound desc = could not find container \"11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633\": container with ID starting with 11ef0d221cfa6f8e92b483041ae101d6590137bac572a16a2b7a6a026ccd3633 not found: ID does not exist"
Feb 23 14:18:06.937305 master-0 kubenswrapper[4171]: I0223 14:18:06.937293 4171 scope.go:117] "RemoveContainer" containerID="6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"
Feb 23 14:18:06.937745 master-0 kubenswrapper[4171]: I0223 14:18:06.937695 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd"} err="failed to get container status \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": rpc error: code = NotFound desc = could not find container \"6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd\": container with ID starting with 6ac1d788dd937e3b0bef1032d597ab986b131c77137f9de86c2201011ca5b2dd not found: ID does not exist"
Feb 23 14:18:06.937745 master-0 kubenswrapper[4171]: I0223 14:18:06.937733 4171 scope.go:117] "RemoveContainer" containerID="9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"
Feb 23 14:18:06.938166 master-0 kubenswrapper[4171]: I0223 14:18:06.938100 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423"} err="failed to get container status \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": rpc error: code = NotFound desc = could not find container \"9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423\": container with ID starting with 9db1a438912c109e738a7c496e19d46a94b79cdd31d7a4c3cecd69cfce6ba423 not found: ID does not exist"
Feb 23 14:18:06.938166 master-0 kubenswrapper[4171]: I0223 14:18:06.938156 4171 scope.go:117] "RemoveContainer" containerID="cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"
Feb 23 14:18:06.938916 master-0 kubenswrapper[4171]: I0223 14:18:06.938851 4171 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f"} err="failed to get container status \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": rpc error: code = NotFound desc = could not find container \"cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f\": container with ID starting with cb3e924f422762905092df78e4ee350e600905b8cc11a70376022b419398450f not found: ID does not exist"
Feb 23 14:18:07.057189 master-0 kubenswrapper[4171]: I0223 14:18:07.057102 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:18:07.057450 master-0 kubenswrapper[4171]: I0223 14:18:07.057208 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:18:07.057699 master-0 kubenswrapper[4171]: E0223 14:18:07.057646 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136"
Feb 23 14:18:07.057850 master-0 kubenswrapper[4171]: E0223 14:18:07.057776 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa"
Feb 23 14:18:07.066054 master-0 kubenswrapper[4171]: I0223 14:18:07.065980 4171 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02998803-dd12-4c68-990e-efd131076a0f" path="/var/lib/kubelet/pods/02998803-dd12-4c68-990e-efd131076a0f/volumes"
Feb 23 14:18:07.687095 master-0 kubenswrapper[4171]: I0223 14:18:07.687027 4171 generic.go:334] "Generic (PLEG): container finished" podID="f10f592e-5738-4879-b776-246b357d4621" containerID="d631da69f8bc3fb53c35b8ef8cedda80eee352d8a4bf7c9c1590bb5315fa046f" exitCode=0
Feb 23 14:18:07.687095 master-0 kubenswrapper[4171]: I0223 14:18:07.687084 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerDied","Data":"d631da69f8bc3fb53c35b8ef8cedda80eee352d8a4bf7c9c1590bb5315fa046f"}
Feb 23 14:18:08.694241 master-0 kubenswrapper[4171]: I0223 14:18:08.693846 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"3ea5faed78b2ea2b8994832461aaaf74c9ef592eac3d0c97920f60f89ceac985"}
Feb 23 14:18:08.694241 master-0 kubenswrapper[4171]: I0223 14:18:08.694213 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"949866c09669bb2b0e28930526f2d54501e60f4ade0138a026b409a4b51634b6"}
Feb 23 14:18:08.694241 master-0 kubenswrapper[4171]: I0223 14:18:08.694226 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"141510cd57858396dc1d9695963a56975d67176a1a34b85a413db22e7a9d1c2d"}
Feb 23 14:18:08.694241 master-0 kubenswrapper[4171]: I0223 14:18:08.694239 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"feadf5cd210bd27ef055d343cee55ad02012eb9f1c5d4c7dac28bf62c520d26a"}
Feb 23 14:18:08.694241 master-0 kubenswrapper[4171]: I0223 14:18:08.694250 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"4abe3d81e03117e533870906b84ab247755d65cb9c415d60fd56970186891b64"}
Feb 23 14:18:08.694241 master-0 kubenswrapper[4171]: I0223 14:18:08.694261 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"c7ce566a8c6dfc791fa99412db4be443eb6afb3b0b0f440d5d6ea4d4f2a79037"}
Feb 23 14:18:09.056963 master-0 kubenswrapper[4171]: I0223 14:18:09.056842 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:18:09.056963 master-0 kubenswrapper[4171]: I0223 14:18:09.056950 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:18:09.057354 master-0 kubenswrapper[4171]: E0223 14:18:09.057076 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:09.057354 master-0 kubenswrapper[4171]: E0223 14:18:09.057154 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:10.670101 master-0 kubenswrapper[4171]: I0223 14:18:10.669960 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:10.671030 master-0 kubenswrapper[4171]: E0223 14:18:10.670170 4171 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Feb 23 14:18:10.671030 master-0 kubenswrapper[4171]: E0223 14:18:10.670244 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:19:14.670222404 +0000 UTC m=+164.773623893 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found Feb 23 14:18:11.056878 master-0 kubenswrapper[4171]: I0223 14:18:11.056816 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:11.058325 master-0 kubenswrapper[4171]: E0223 14:18:11.058238 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:11.058459 master-0 kubenswrapper[4171]: I0223 14:18:11.058323 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:11.058585 master-0 kubenswrapper[4171]: E0223 14:18:11.058536 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:11.680568 master-0 kubenswrapper[4171]: I0223 14:18:11.680344 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:11.681784 master-0 kubenswrapper[4171]: E0223 14:18:11.680668 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 14:18:11.681784 master-0 kubenswrapper[4171]: E0223 14:18:11.680732 4171 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 14:18:11.681784 master-0 kubenswrapper[4171]: E0223 14:18:11.680755 4171 projected.go:194] Error preparing data for projected volume kube-api-access-x4lz2 for pod openshift-network-diagnostics/network-check-target-x9gxm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 14:18:11.681784 master-0 kubenswrapper[4171]: E0223 14:18:11.680862 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2 podName:ded555da-db03-498e-81a9-ad166f29a2aa nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.680829448 +0000 UTC m=+133.784231137 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x4lz2" (UniqueName: "kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2") pod "network-check-target-x9gxm" (UID: "ded555da-db03-498e-81a9-ad166f29a2aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 14:18:11.716704 master-0 kubenswrapper[4171]: I0223 14:18:11.716591 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"c159bc6185d4445126a130871d407c6543008d04b81d0dc8a0df37b2a655095d"} Feb 23 14:18:13.057154 master-0 kubenswrapper[4171]: I0223 14:18:13.057076 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:13.058036 master-0 kubenswrapper[4171]: E0223 14:18:13.057249 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:13.058187 master-0 kubenswrapper[4171]: I0223 14:18:13.058153 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:13.058390 master-0 kubenswrapper[4171]: E0223 14:18:13.058309 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:13.727056 master-0 kubenswrapper[4171]: I0223 14:18:13.726971 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"f429434a74a7d0af0afa5ab1f60331480efe79d0b9a3742ea96282333066d9bc"} Feb 23 14:18:13.727414 master-0 kubenswrapper[4171]: I0223 14:18:13.727352 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:13.754142 master-0 kubenswrapper[4171]: I0223 14:18:13.753657 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:13.793370 master-0 kubenswrapper[4171]: I0223 14:18:13.793287 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" podStartSLOduration=7.793264422 podStartE2EDuration="7.793264422s" podCreationTimestamp="2026-02-23 14:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:18:13.756359942 +0000 UTC m=+103.859761491" watchObservedRunningTime="2026-02-23 14:18:13.793264422 +0000 UTC m=+103.896665911" Feb 23 14:18:14.731976 master-0 kubenswrapper[4171]: I0223 14:18:14.731449 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:14.733103 master-0 kubenswrapper[4171]: I0223 14:18:14.732008 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:14.766241 master-0 kubenswrapper[4171]: I0223 14:18:14.766148 4171 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:15.057417 master-0 kubenswrapper[4171]: I0223 14:18:15.057205 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:15.057417 master-0 kubenswrapper[4171]: I0223 14:18:15.057348 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:15.058121 master-0 kubenswrapper[4171]: E0223 14:18:15.057457 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:15.058121 master-0 kubenswrapper[4171]: E0223 14:18:15.057690 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:15.074043 master-0 kubenswrapper[4171]: I0223 14:18:15.072005 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x9gxm"] Feb 23 14:18:15.075850 master-0 kubenswrapper[4171]: I0223 14:18:15.075783 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9dnsv"] Feb 23 14:18:15.736084 master-0 kubenswrapper[4171]: I0223 14:18:15.735986 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:15.736084 master-0 kubenswrapper[4171]: I0223 14:18:15.736056 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:15.737133 master-0 kubenswrapper[4171]: E0223 14:18:15.736185 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:15.737133 master-0 kubenswrapper[4171]: E0223 14:18:15.736821 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:18.056953 master-0 kubenswrapper[4171]: I0223 14:18:18.056852 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:18.057968 master-0 kubenswrapper[4171]: E0223 14:18:18.057007 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9gxm" podUID="ded555da-db03-498e-81a9-ad166f29a2aa" Feb 23 14:18:18.057968 master-0 kubenswrapper[4171]: I0223 14:18:18.056851 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:18.057968 master-0 kubenswrapper[4171]: E0223 14:18:18.057195 4171 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9dnsv" podUID="ace75aae-6f4f-4299-90e2-d5292271b136" Feb 23 14:18:19.598022 master-0 kubenswrapper[4171]: I0223 14:18:19.597794 4171 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Feb 23 14:18:19.598876 master-0 kubenswrapper[4171]: I0223 14:18:19.598068 4171 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Feb 23 14:18:19.643505 master-0 kubenswrapper[4171]: I0223 14:18:19.643398 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"] Feb 23 14:18:19.643899 master-0 kubenswrapper[4171]: I0223 14:18:19.643850 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:19.661707 master-0 kubenswrapper[4171]: I0223 14:18:19.661513 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 23 14:18:19.663199 master-0 kubenswrapper[4171]: I0223 14:18:19.662463 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 23 14:18:19.663527 master-0 kubenswrapper[4171]: I0223 14:18:19.663238 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 14:18:19.663527 master-0 kubenswrapper[4171]: I0223 14:18:19.663499 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 23 14:18:19.668101 master-0 kubenswrapper[4171]: I0223 14:18:19.667971 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"] Feb 23 14:18:19.680543 master-0 kubenswrapper[4171]: I0223 14:18:19.679203 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"] Feb 23 14:18:19.680543 master-0 kubenswrapper[4171]: I0223 14:18:19.679715 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:19.680543 master-0 kubenswrapper[4171]: I0223 14:18:19.680286 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.680885 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"] Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.681115 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"] Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.681270 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"] Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.681328 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.681434 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.681652 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.683429 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"] Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.683698 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz"] Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.683858 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"] Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.684056 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:19.685048 master-0 kubenswrapper[4171]: I0223 14:18:19.684327 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.685591 master-0 kubenswrapper[4171]: I0223 14:18:19.685119 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" Feb 23 14:18:19.689514 master-0 kubenswrapper[4171]: I0223 14:18:19.686610 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"] Feb 23 14:18:19.689514 master-0 kubenswrapper[4171]: I0223 14:18:19.687043 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"] Feb 23 14:18:19.689514 master-0 kubenswrapper[4171]: I0223 14:18:19.687541 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:19.689514 master-0 kubenswrapper[4171]: I0223 14:18:19.687667 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:19.689514 master-0 kubenswrapper[4171]: I0223 14:18:19.688798 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-8c7d49845-5rk2g"] Feb 23 14:18:19.689764 master-0 kubenswrapper[4171]: I0223 14:18:19.689540 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 14:18:19.689764 master-0 kubenswrapper[4171]: I0223 14:18:19.689580 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:19.689833 master-0 kubenswrapper[4171]: I0223 14:18:19.689772 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.689884 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.690027 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.690147 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.690276 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.690452 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.690592 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.690825 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 14:18:19.693526 master-0 kubenswrapper[4171]: I0223 14:18:19.693388 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"] Feb 23 14:18:19.693850 master-0 
kubenswrapper[4171]: I0223 14:18:19.693682 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.693994 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.694031 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.694146 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.694194 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.694362 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.694375 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"] Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.694545 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.694880 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.695102 4171 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.695142 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.695236 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.695466 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Feb 23 14:18:19.697112 master-0 kubenswrapper[4171]: I0223 14:18:19.695665 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.695773 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.695865 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.695899 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.695935 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.695968 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 
14:18:19.696928 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697011 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697266 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697552 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697591 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697621 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.702450 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"]
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697664 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697692 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697727 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.702862 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"]
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.697766 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.703124 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"]
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.703145 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.703459 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"]
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.703903 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"
Feb 23 14:18:19.704153 master-0 kubenswrapper[4171]: I0223 14:18:19.703972 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:19.706055 master-0 kubenswrapper[4171]: I0223 14:18:19.704219 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:19.706055 master-0 kubenswrapper[4171]: I0223 14:18:19.704658 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"]
Feb 23 14:18:19.706055 master-0 kubenswrapper[4171]: I0223 14:18:19.704874 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:19.707980 master-0 kubenswrapper[4171]: I0223 14:18:19.707691 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"]
Feb 23 14:18:19.707980 master-0 kubenswrapper[4171]: I0223 14:18:19.707774 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"]
Feb 23 14:18:19.709522 master-0 kubenswrapper[4171]: I0223 14:18:19.709496 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"]
Feb 23 14:18:19.709522 master-0 kubenswrapper[4171]: I0223 14:18:19.709519 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-t5h8h"]
Feb 23 14:18:19.710007 master-0 kubenswrapper[4171]: I0223 14:18:19.709783 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-8c7d49845-5rk2g"]
Feb 23 14:18:19.710007 master-0 kubenswrapper[4171]: I0223 14:18:19.709864 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:19.711638 master-0 kubenswrapper[4171]: I0223 14:18:19.711580 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"]
Feb 23 14:18:19.724800 master-0 kubenswrapper[4171]: I0223 14:18:19.724624 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz"]
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.725742 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"]
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.726572 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"]
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.726763 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.726955 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.726959 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.726994 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727084 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727166 4171 reflector.go:368] Caches
populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727221 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727282 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727331 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727429 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727093 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727580 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727865 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727949 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.727961 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 23 14:18:19.728200 master-0 kubenswrapper[4171]: I0223 14:18:19.728121 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 23 14:18:19.729106 master-0 kubenswrapper[4171]: I0223 14:18:19.728790 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 23 14:18:19.729854 master-0 kubenswrapper[4171]: I0223 14:18:19.729802 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 23 14:18:19.729907 master-0 kubenswrapper[4171]: I0223 14:18:19.729883 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 23 14:18:19.730021 master-0 kubenswrapper[4171]: I0223 14:18:19.729993 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 23 14:18:19.730055 master-0 kubenswrapper[4171]: I0223 14:18:19.730023 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 23 14:18:19.730130 master-0 kubenswrapper[4171]: I0223 14:18:19.730107 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 23 14:18:19.730403 master-0 kubenswrapper[4171]: I0223 14:18:19.730378 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 23 14:18:19.730491 master-0 kubenswrapper[4171]: I0223 14:18:19.730449 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Feb 23 14:18:19.732586 master-0 kubenswrapper[4171]: I0223 14:18:19.732541 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"]
Feb 23 14:18:19.733831 master-0 kubenswrapper[4171]: I0223 14:18:19.733782 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"]
Feb 23 14:18:19.735984 master-0 kubenswrapper[4171]: I0223 14:18:19.735910 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Feb 23 14:18:19.736281 master-0 kubenswrapper[4171]: I0223 14:18:19.736246 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"]
Feb 23 14:18:19.736379 master-0 kubenswrapper[4171]: I0223 14:18:19.736360 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"]
Feb 23 14:18:19.738377 master-0 kubenswrapper[4171]: I0223 14:18:19.738329 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"]
Feb 23 14:18:19.739818 master-0 kubenswrapper[4171]: I0223 14:18:19.739792 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 14:18:19.742963 master-0 kubenswrapper[4171]: I0223 14:18:19.742917 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"]
Feb 23 14:18:19.744456 master-0 kubenswrapper[4171]: I0223 14:18:19.744307 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"]
Feb 23 14:18:19.745264 master-0 kubenswrapper[4171]: I0223 14:18:19.745230 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"]
Feb 23 14:18:19.745517 master-0 kubenswrapper[4171]: I0223 14:18:19.745450 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 23 14:18:19.745698 master-0 kubenswrapper[4171]: I0223 14:18:19.745672 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"]
Feb 23 14:18:19.746776 master-0 kubenswrapper[4171]: I0223 14:18:19.746738 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 23 14:18:19.762193 master-0 kubenswrapper[4171]: I0223 14:18:19.762143 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:18:19.762365 master-0 kubenswrapper[4171]: I0223 14:18:19.762284 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2p2\" (UniqueName: \"kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:18:19.762396 master-0 kubenswrapper[4171]: I0223 14:18:19.762370 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:18:19.862845 master-0 kubenswrapper[4171]: I0223 14:18:19.862804 4171 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:19.862942 master-0 kubenswrapper[4171]: I0223 14:18:19.862869 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"
Feb 23 14:18:19.862942 master-0 kubenswrapper[4171]: I0223 14:18:19.862904 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:18:19.863018 master-0 kubenswrapper[4171]: I0223 14:18:19.862980 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:18:19.863047 master-0 kubenswrapper[4171]: I0223 14:18:19.863029 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qggzs\" (UniqueName: \"kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:19.863105 master-0 kubenswrapper[4171]: I0223 14:18:19.863065 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:18:19.863105 master-0 kubenswrapper[4171]: I0223 14:18:19.863092 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chznd\" (UniqueName: \"kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"
Feb 23 14:18:19.863231 master-0 kubenswrapper[4171]: I0223 14:18:19.863182 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:19.863267 master-0 kubenswrapper[4171]: I0223 14:18:19.863227 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:19.863358 master-0 kubenswrapper[4171]: I0223 14:18:19.863314 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"
Feb 23 14:18:19.863411 master-0 kubenswrapper[4171]: I0223 14:18:19.863383 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:19.863439 master-0 kubenswrapper[4171]: I0223 14:18:19.863418 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:19.863497 master-0 kubenswrapper[4171]: I0223 14:18:19.863453 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8llc8\" (UniqueName: \"kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8\") pod \"csi-snapshot-controller-operator-6fb4df594f-hkcgz\" (UID: \"a4ae9292-71dc-4484-b277-43cb26c1e04d\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz"
Feb 23 14:18:19.863556 master-0 kubenswrapper[4171]: I0223 14:18:19.863531 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2p2\" (UniqueName: \"kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:18:19.863596 master-0 kubenswrapper[4171]: I0223 14:18:19.863580 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:18:19.863648 master-0 kubenswrapper[4171]: I0223 14:18:19.863618 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:19.863684 master-0 kubenswrapper[4171]: I0223 14:18:19.863668 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID:
\"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:19.863734 master-0 kubenswrapper[4171]: I0223 14:18:19.863711 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:19.863765 master-0 kubenswrapper[4171]: I0223 14:18:19.863741 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:19.863815 master-0 kubenswrapper[4171]: I0223 14:18:19.863781 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:18:19.863842 master-0 kubenswrapper[4171]: I0223 14:18:19.863816 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzsd\" (UniqueName: \"kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:19.863892 master-0 kubenswrapper[4171]: I0223 14:18:19.863848 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:19.863892 master-0 kubenswrapper[4171]: I0223 14:18:19.863878 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5f8j\" (UniqueName: \"kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:19.863985 master-0 kubenswrapper[4171]: I0223 14:18:19.863963 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"
Feb 23 14:18:19.864162 master-0 kubenswrapper[4171]: I0223 14:18:19.864130 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:19.864215 master-0 kubenswrapper[4171]: I0223 14:18:19.864179 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"
Feb 23 14:18:19.864243 master-0 kubenswrapper[4171]: I0223 14:18:19.864211 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:19.864297 master-0 kubenswrapper[4171]: I0223 14:18:19.864245 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qszm\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:19.864667 master-0 kubenswrapper[4171]: I0223 14:18:19.864641 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:18:19.864733 master-0 kubenswrapper[4171]: I0223 14:18:19.864697 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nr85\" (UniqueName: \"kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:19.864766 master-0 kubenswrapper[4171]: I0223 14:18:19.864730 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:18:19.864766 master-0 kubenswrapper[4171]: I0223 14:18:19.864750 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"
Feb 23 14:18:19.864844 master-0 kubenswrapper[4171]: I0223 14:18:19.864775 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:18:19.864844 master-0 kubenswrapper[4171]: I0223 14:18:19.864823 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hp42\" (UniqueName: \"kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") "
pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:18:19.864916 master-0 kubenswrapper[4171]: I0223 14:18:19.864845 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:18:19.864916 master-0 kubenswrapper[4171]: I0223 14:18:19.864866 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"
Feb 23 14:18:19.864916 master-0 kubenswrapper[4171]: I0223 14:18:19.864906 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4sbp\" (UniqueName: \"kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:19.865012 master-0 kubenswrapper[4171]: I0223 14:18:19.864926 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:19.865012 master-0 kubenswrapper[4171]: I0223 14:18:19.864945 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:19.865012 master-0 kubenswrapper[4171]: I0223 14:18:19.864964 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:19.865128 master-0 kubenswrapper[4171]: I0223 14:18:19.865020 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"
Feb 23 14:18:19.865128 master-0 kubenswrapper[4171]: I0223 14:18:19.865044 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7jvd\" (UniqueName: \"kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:19.865128 master-0 kubenswrapper[4171]: I0223 14:18:19.865064 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr868\" (UniqueName: \"kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"
Feb 23 14:18:19.865128 master-0 kubenswrapper[4171]: I0223 14:18:19.865083 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9tkx\" (UniqueName: \"kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:19.865128 master-0 kubenswrapper[4171]: I0223 14:18:19.865105 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:19.865128 master-0 kubenswrapper[4171]: I0223 14:18:19.865128 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:19.865328 master-0 kubenswrapper[4171]: I0223 14:18:19.865167 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:19.865328 master-0 kubenswrapper[4171]: I0223 14:18:19.865226 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:19.865328 master-0 kubenswrapper[4171]: I0223 14:18:19.865285 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"
Feb 23 14:18:19.865328 master-0 kubenswrapper[4171]: I0223 14:18:19.865317 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6tj\" (UniqueName: \"kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"
Feb 23 14:18:19.865465 master-0 kubenswrapper[4171]: I0223 14:18:19.865347 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:19.865465 master-0 kubenswrapper[4171]: I0223 14:18:19.865371 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmkf\" (UniqueName: \"kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:19.865465 master-0 kubenswrapper[4171]: I0223 14:18:19.865392 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:19.865465 master-0 kubenswrapper[4171]: I0223 14:18:19.865411 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:19.865465 master-0 kubenswrapper[4171]: I0223 14:18:19.865433 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:19.865465 master-0 kubenswrapper[4171]: I0223 14:18:19.865455 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j54j5\" (UniqueName:
\"kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:19.865678 master-0 kubenswrapper[4171]: I0223 14:18:19.865518 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:19.865678 master-0 kubenswrapper[4171]: I0223 14:18:19.865582 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:19.865678 master-0 kubenswrapper[4171]: I0223 14:18:19.865606 4171 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:19.865778 master-0 kubenswrapper[4171]: I0223 14:18:19.865713 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert\") pod 
\"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:19.875791 master-0 kubenswrapper[4171]: I0223 14:18:19.875754 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:19.880949 master-0 kubenswrapper[4171]: I0223 14:18:19.880909 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2p2\" (UniqueName: \"kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:19.967736 master-0 kubenswrapper[4171]: I0223 14:18:19.967692 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:19.967736 master-0 kubenswrapper[4171]: I0223 14:18:19.967740 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chznd\" (UniqueName: \"kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd\") pod 
\"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:19.968028 master-0 kubenswrapper[4171]: I0223 14:18:19.967769 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:19.968028 master-0 kubenswrapper[4171]: E0223 14:18:19.967884 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:19.968028 master-0 kubenswrapper[4171]: E0223 14:18:19.967936 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:20.467917239 +0000 UTC m=+110.571318748 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968108 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968158 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968269 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: E0223 14:18:19.968295 4171 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: E0223 14:18:19.968458 4171 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:20.468371548 +0000 UTC m=+110.571773127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968517 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968558 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llc8\" (UniqueName: \"kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8\") pod \"csi-snapshot-controller-operator-6fb4df594f-hkcgz\" (UID: \"a4ae9292-71dc-4484-b277-43cb26c1e04d\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968584 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 
14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968605 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968640 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968707 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968734 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.968754 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5f8j\" (UniqueName: 
\"kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: I0223 14:18:19.969040 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.970377 master-0 kubenswrapper[4171]: E0223 14:18:19.969050 4171 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Feb 23 14:18:19.971969 master-0 kubenswrapper[4171]: E0223 14:18:19.969183 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:20.469139652 +0000 UTC m=+110.572541201 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found Feb 23 14:18:19.971969 master-0 kubenswrapper[4171]: I0223 14:18:19.969809 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:19.971969 master-0 kubenswrapper[4171]: I0223 14:18:19.969942 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.971969 master-0 kubenswrapper[4171]: I0223 14:18:19.970125 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.971969 master-0 kubenswrapper[4171]: I0223 14:18:19.971374 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " 
pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:19.971969 master-0 kubenswrapper[4171]: I0223 14:18:19.971421 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:19.971969 master-0 kubenswrapper[4171]: I0223 14:18:19.971441 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzsd\" (UniqueName: \"kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:19.973648 master-0 kubenswrapper[4171]: I0223 14:18:19.973057 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:19.973648 master-0 kubenswrapper[4171]: I0223 14:18:19.973400 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.973648 master-0 kubenswrapper[4171]: I0223 14:18:19.973540 4171 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:19.973648 master-0 kubenswrapper[4171]: I0223 14:18:19.973591 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qszm\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:19.973648 master-0 kubenswrapper[4171]: I0223 14:18:19.973619 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr85\" (UniqueName: \"kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.973648 master-0 kubenswrapper[4171]: I0223 14:18:19.973645 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:19.973648 master-0 kubenswrapper[4171]: I0223 14:18:19.973666 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config\") pod 
\"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973691 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973724 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973748 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973771 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:19.974454 master-0 
kubenswrapper[4171]: I0223 14:18:19.973806 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sbp\" (UniqueName: \"kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973827 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp42\" (UniqueName: \"kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973847 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973867 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973889 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973909 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973931 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr868\" (UniqueName: \"kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973951 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tkx\" (UniqueName: \"kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.973972 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.974027 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7jvd\" (UniqueName: \"kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.974454 master-0 kubenswrapper[4171]: I0223 14:18:19.974051 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974109 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974135 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974163 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974175 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974212 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974232 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6tj\" (UniqueName: \"kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: E0223 14:18:19.974252 4171 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974260 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974281 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: E0223 14:18:19.974301 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:20.474284338 +0000 UTC m=+110.577685837 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974322 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974346 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmkf\" (UniqueName: \"kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974372 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974396 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: 
I0223 14:18:19.974420 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:19.975613 master-0 kubenswrapper[4171]: I0223 14:18:19.974442 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54j5\" (UniqueName: \"kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:19.976554 master-0 kubenswrapper[4171]: I0223 14:18:19.974466 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:19.976554 master-0 kubenswrapper[4171]: I0223 14:18:19.974522 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.976554 master-0 kubenswrapper[4171]: I0223 14:18:19.975522 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: 
\"kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:19.976554 master-0 kubenswrapper[4171]: I0223 14:18:19.975565 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:19.976554 master-0 kubenswrapper[4171]: I0223 14:18:19.975594 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:19.976554 master-0 kubenswrapper[4171]: I0223 14:18:19.975617 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:19.976554 master-0 kubenswrapper[4171]: I0223 14:18:19.975642 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qggzs\" (UniqueName: \"kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:19.976982 master-0 kubenswrapper[4171]: E0223 14:18:19.976877 4171 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Feb 23 14:18:19.976982 master-0 kubenswrapper[4171]: I0223 14:18:19.976929 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:19.976982 master-0 kubenswrapper[4171]: E0223 14:18:19.976942 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:20.476921818 +0000 UTC m=+110.580323437 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found Feb 23 14:18:19.978350 master-0 kubenswrapper[4171]: E0223 14:18:19.977273 4171 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:19.978350 master-0 kubenswrapper[4171]: E0223 14:18:19.977344 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. 
No retries permitted until 2026-02-23 14:18:20.477324085 +0000 UTC m=+110.580725604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:19.978350 master-0 kubenswrapper[4171]: I0223 14:18:19.977357 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.978350 master-0 kubenswrapper[4171]: I0223 14:18:19.977811 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:18:19.978350 master-0 kubenswrapper[4171]: I0223 14:18:19.978204 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:19.979823 master-0 kubenswrapper[4171]: I0223 14:18:19.979788 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca\") pod 
\"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:19.982344 master-0 kubenswrapper[4171]: I0223 14:18:19.981614 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:19.982344 master-0 kubenswrapper[4171]: I0223 14:18:19.981802 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:18:19.982344 master-0 kubenswrapper[4171]: E0223 14:18:19.981868 4171 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:19.982344 master-0 kubenswrapper[4171]: E0223 14:18:19.981936 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:20.481907171 +0000 UTC m=+110.585308700 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found Feb 23 14:18:19.982344 master-0 kubenswrapper[4171]: I0223 14:18:19.982279 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:19.983572 master-0 kubenswrapper[4171]: I0223 14:18:19.983499 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:19.984195 master-0 kubenswrapper[4171]: I0223 14:18:19.983990 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:19.984884 master-0 kubenswrapper[4171]: E0223 14:18:19.984678 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Feb 23 14:18:19.984884 master-0 kubenswrapper[4171]: I0223 14:18:19.984720 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.984884 master-0 kubenswrapper[4171]: E0223 14:18:19.984752 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:20.484724534 +0000 UTC m=+110.588126033 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found Feb 23 14:18:19.988576 master-0 kubenswrapper[4171]: I0223 14:18:19.986159 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:19.988576 master-0 kubenswrapper[4171]: I0223 14:18:19.987370 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:19.988576 master-0 kubenswrapper[4171]: I0223 14:18:19.987874 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:19.988576 master-0 kubenswrapper[4171]: I0223 14:18:19.988207 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.988971 master-0 kubenswrapper[4171]: I0223 14:18:19.988631 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chznd\" (UniqueName: \"kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:19.991532 master-0 kubenswrapper[4171]: I0223 14:18:19.990688 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llc8\" (UniqueName: \"kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8\") pod \"csi-snapshot-controller-operator-6fb4df594f-hkcgz\" (UID: \"a4ae9292-71dc-4484-b277-43cb26c1e04d\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" Feb 23 14:18:19.991532 master-0 kubenswrapper[4171]: I0223 14:18:19.990871 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: 
\"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:19.991532 master-0 kubenswrapper[4171]: I0223 14:18:19.990952 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:19.991532 master-0 kubenswrapper[4171]: I0223 14:18:19.991268 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:19.992084 master-0 kubenswrapper[4171]: I0223 14:18:19.992041 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:19.996732 master-0 kubenswrapper[4171]: I0223 14:18:19.994536 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:19.996732 master-0 kubenswrapper[4171]: I0223 14:18:19.995918 4171 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:19.996732 master-0 kubenswrapper[4171]: I0223 14:18:19.996696 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:20.002512 master-0 kubenswrapper[4171]: I0223 14:18:19.999561 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzsd\" (UniqueName: \"kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:20.002512 master-0 kubenswrapper[4171]: I0223 14:18:20.001590 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5f8j\" (UniqueName: \"kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:20.002512 master-0 kubenswrapper[4171]: I0223 14:18:20.002353 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: 
\"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:20.004127 master-0 kubenswrapper[4171]: I0223 14:18:20.004089 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr868\" (UniqueName: \"kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:20.005440 master-0 kubenswrapper[4171]: I0223 14:18:20.005338 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6tj\" (UniqueName: \"kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:20.005994 master-0 kubenswrapper[4171]: I0223 14:18:20.005932 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qszm\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:20.016960 master-0 kubenswrapper[4171]: I0223 14:18:20.016900 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qggzs\" (UniqueName: \"kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:20.020160 master-0 kubenswrapper[4171]: I0223 14:18:20.020117 4171 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7jvd\" (UniqueName: \"kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:20.025688 master-0 kubenswrapper[4171]: I0223 14:18:20.025659 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:20.050260 master-0 kubenswrapper[4171]: I0223 14:18:20.050219 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:20.050422 master-0 kubenswrapper[4171]: I0223 14:18:20.050399 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:20.057210 master-0 kubenswrapper[4171]: I0223 14:18:20.057177 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:20.057272 master-0 kubenswrapper[4171]: I0223 14:18:20.057242 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:20.059802 master-0 kubenswrapper[4171]: I0223 14:18:20.059766 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:20.070513 master-0 kubenswrapper[4171]: I0223 14:18:20.070465 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tkx\" (UniqueName: \"kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:20.083560 master-0 kubenswrapper[4171]: I0223 14:18:20.083529 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:20.095326 master-0 kubenswrapper[4171]: I0223 14:18:20.095283 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:20.097152 master-0 kubenswrapper[4171]: I0223 14:18:20.097112 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" Feb 23 14:18:20.101691 master-0 kubenswrapper[4171]: I0223 14:18:20.100443 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hp42\" (UniqueName: \"kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:20.110718 master-0 kubenswrapper[4171]: I0223 14:18:20.110353 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:20.128668 master-0 kubenswrapper[4171]: I0223 14:18:20.128588 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmkf\" (UniqueName: \"kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:18:20.155568 master-0 kubenswrapper[4171]: I0223 14:18:20.154333 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:20.158626 master-0 kubenswrapper[4171]: I0223 14:18:20.158571 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nr85\" (UniqueName: \"kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:20.172856 master-0 kubenswrapper[4171]: I0223 14:18:20.172798 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:20.174156 master-0 kubenswrapper[4171]: I0223 14:18:20.174119 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54j5\" (UniqueName: \"kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:20.188586 master-0 kubenswrapper[4171]: I0223 14:18:20.185661 4171 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sbp\" (UniqueName: \"kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:20.193854 master-0 kubenswrapper[4171]: I0223 14:18:20.191175 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 14:18:20.201180 master-0 kubenswrapper[4171]: I0223 14:18:20.201132 4171 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:20.207860 master-0 kubenswrapper[4171]: I0223 14:18:20.205643 4171 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 23 14:18:20.226686 master-0 kubenswrapper[4171]: I0223 14:18:20.226632 4171 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 23 14:18:20.277394 master-0 kubenswrapper[4171]: I0223 14:18:20.277331 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"]
Feb 23 14:18:20.327385 master-0 kubenswrapper[4171]: I0223 14:18:20.327329 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"]
Feb 23 14:18:20.342013 master-0 kubenswrapper[4171]: I0223 14:18:20.341969 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:18:20.345248 master-0 kubenswrapper[4171]: I0223 14:18:20.345195 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"]
Feb 23 14:18:20.356237 master-0 kubenswrapper[4171]: W0223 14:18:20.356058 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb714a9df_026e_423d_a980_2569f0d92e47.slice/crio-5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d WatchSource:0}: Error finding container 5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d: Status 404 returned error can't find the container with id 5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d
Feb 23 14:18:20.358634 master-0 kubenswrapper[4171]: I0223 14:18:20.358603 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"]
Feb 23 14:18:20.370381 master-0 kubenswrapper[4171]: W0223 14:18:20.370345 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb6e88cd_98de_446a_92e8_f56a2f133703.slice/crio-6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8 WatchSource:0}: Error finding container 6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8: Status 404 returned error can't find the container with id 6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8
Feb 23 14:18:20.375934 master-0 kubenswrapper[4171]: I0223 14:18:20.375892 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:18:20.389818 master-0 kubenswrapper[4171]: I0223 14:18:20.389778 4171 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:20.406535 master-0 kubenswrapper[4171]: I0223 14:18:20.406459 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"]
Feb 23 14:18:20.411163 master-0 kubenswrapper[4171]: I0223 14:18:20.411119 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"]
Feb 23 14:18:20.424320 master-0 kubenswrapper[4171]: W0223 14:18:20.424271 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de1f285_47ac_42aa_8026_8addce656362.slice/crio-bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc WatchSource:0}: Error finding container bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc: Status 404 returned error can't find the container with id bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc
Feb 23 14:18:20.483033 master-0 kubenswrapper[4171]: I0223 14:18:20.482988 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:20.483159 master-0 kubenswrapper[4171]: I0223 14:18:20.483053 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:20.483159 master-0 kubenswrapper[4171]: I0223 14:18:20.483087 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:20.483309 master-0 kubenswrapper[4171]: E0223 14:18:20.483256 4171 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Feb 23 14:18:20.483397 master-0 kubenswrapper[4171]: E0223 14:18:20.483371 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.483343716 +0000 UTC m=+111.586745235 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found
Feb 23 14:18:20.484016 master-0 kubenswrapper[4171]: I0223 14:18:20.483970 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:20.484070 master-0 kubenswrapper[4171]: I0223 14:18:20.484040 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:20.484177 master-0 kubenswrapper[4171]: I0223 14:18:20.484143 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:20.484233 master-0 kubenswrapper[4171]: I0223 14:18:20.484183 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:20.484342 master-0 kubenswrapper[4171]: E0223 14:18:20.484311 4171 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Feb 23 14:18:20.484390 master-0 kubenswrapper[4171]: E0223 14:18:20.484367 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.484351635 +0000 UTC m=+111.587753154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found
Feb 23 14:18:20.484460 master-0 kubenswrapper[4171]: E0223 14:18:20.484439 4171 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Feb 23 14:18:20.484525 master-0 kubenswrapper[4171]: E0223 14:18:20.484513 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.484470377 +0000 UTC m=+111.587871896 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found
Feb 23 14:18:20.484601 master-0 kubenswrapper[4171]: E0223 14:18:20.484581 4171 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:20.484642 master-0 kubenswrapper[4171]: E0223 14:18:20.484623 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.484611769 +0000 UTC m=+111.588013288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found
Feb 23 14:18:20.484708 master-0 kubenswrapper[4171]: E0223 14:18:20.484687 4171 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:20.484753 master-0 kubenswrapper[4171]: E0223 14:18:20.484732 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.484720681 +0000 UTC m=+111.588122200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:20.484814 master-0 kubenswrapper[4171]: E0223 14:18:20.484793 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:20.484939 master-0 kubenswrapper[4171]: E0223 14:18:20.484835 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.484823923 +0000 UTC m=+111.588225462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:20.484939 master-0 kubenswrapper[4171]: E0223 14:18:20.484900 4171 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:20.484939 master-0 kubenswrapper[4171]: E0223 14:18:20.484933 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.484922045 +0000 UTC m=+111.588323564 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found
Feb 23 14:18:20.517203 master-0 kubenswrapper[4171]: I0223 14:18:20.517163 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"]
Feb 23 14:18:20.547966 master-0 kubenswrapper[4171]: I0223 14:18:20.547894 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"]
Feb 23 14:18:20.555909 master-0 kubenswrapper[4171]: W0223 14:18:20.555859 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24829faf_50e8_45bb_abb0_7cc5ccf81080.slice/crio-fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60 WatchSource:0}: Error finding container fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60: Status 404 returned error can't find the container with id fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60
Feb 23 14:18:20.568332 master-0 kubenswrapper[4171]: I0223 14:18:20.568290 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"]
Feb 23 14:18:20.576176 master-0 kubenswrapper[4171]: W0223 14:18:20.576134 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2d00ece_7586_4346_adbb_eaae1aeda69e.slice/crio-b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042 WatchSource:0}: Error finding container b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042: Status 404 returned error can't find the container with id b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042
Feb 23 14:18:20.580907 master-0 kubenswrapper[4171]: I0223 14:18:20.580880 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"]
Feb 23 14:18:20.585506 master-0 kubenswrapper[4171]: I0223 14:18:20.585445 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:20.585650 master-0 kubenswrapper[4171]: E0223 14:18:20.585599 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Feb 23 14:18:20.585735 master-0 kubenswrapper[4171]: E0223 14:18:20.585710 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:21.585671829 +0000 UTC m=+111.689073318 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found
Feb 23 14:18:20.588117 master-0 kubenswrapper[4171]: I0223 14:18:20.586584 4171 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz"]
Feb 23 14:18:20.591856 master-0 kubenswrapper[4171]: W0223 14:18:20.591815 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2aa0d48_7c8e_4ddb_84a3_b3c34414c061.slice/crio-4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228 WatchSource:0}: Error finding container 4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228: Status 404 returned error can't find the container with id 4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228
Feb 23 14:18:20.592429 master-0 kubenswrapper[4171]: W0223 14:18:20.592389 4171 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ae9292_71dc_4484_b277_43cb26c1e04d.slice/crio-8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015 WatchSource:0}: Error finding container 8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015: Status 404 returned error can't find the container with id 8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015
Feb 23 14:18:20.754765 master-0 kubenswrapper[4171]: I0223 14:18:20.754665 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerStarted","Data":"5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d"}
Feb 23 14:18:20.756282 master-0 kubenswrapper[4171]: I0223 14:18:20.756073 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerStarted","Data":"b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042"}
Feb 23 14:18:20.757235 master-0 kubenswrapper[4171]: I0223 14:18:20.757172 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerStarted","Data":"4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98"}
Feb 23 14:18:20.758512 master-0 kubenswrapper[4171]: I0223 14:18:20.758441 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" event={"ID":"a4ae9292-71dc-4484-b277-43cb26c1e04d","Type":"ContainerStarted","Data":"8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015"}
Feb 23 14:18:20.759602 master-0 kubenswrapper[4171]: I0223 14:18:20.759556 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerStarted","Data":"6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8"}
Feb 23 14:18:20.761604 master-0 kubenswrapper[4171]: I0223 14:18:20.761566 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerStarted","Data":"5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea"}
Feb 23 14:18:20.761604 master-0 kubenswrapper[4171]: I0223 14:18:20.761601 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerStarted","Data":"7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca"}
Feb 23 14:18:20.763120 master-0 kubenswrapper[4171]: I0223 14:18:20.763085 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f"}
Feb 23 14:18:20.766763 master-0 kubenswrapper[4171]: I0223 14:18:20.766729 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerStarted","Data":"bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc"}
Feb 23 14:18:20.768696 master-0 kubenswrapper[4171]: I0223 14:18:20.768645 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerStarted","Data":"fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60"}
Feb 23 14:18:20.769938 master-0 kubenswrapper[4171]: I0223 14:18:20.769901 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerStarted","Data":"9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f"}
Feb 23 14:18:20.784853 master-0 kubenswrapper[4171]: I0223 14:18:20.784376 4171 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" podStartSLOduration=74.784343233 podStartE2EDuration="1m14.784343233s" podCreationTimestamp="2026-02-23 14:17:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:18:20.779523383 +0000 UTC m=+110.882924882" watchObservedRunningTime="2026-02-23 14:18:20.784343233 +0000 UTC m=+110.887744762"
Feb 23 14:18:20.784853 master-0 kubenswrapper[4171]: I0223 14:18:20.784745 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t5h8h" event={"ID":"08c561b3-613b-425f-9de4-d5fc8762ea51","Type":"ContainerStarted","Data":"4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a"}
Feb 23 14:18:20.802423 master-0 kubenswrapper[4171]: I0223 14:18:20.802311 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerStarted","Data":"4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228"}
Feb 23 14:18:21.495948 master-0 kubenswrapper[4171]: I0223 14:18:21.495800 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:21.495948 master-0 kubenswrapper[4171]: I0223 14:18:21.495862 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:21.495948 master-0 kubenswrapper[4171]: I0223 14:18:21.495899 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:21.496252 master-0 kubenswrapper[4171]: E0223 14:18:21.496044 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:21.496252 master-0 kubenswrapper[4171]: E0223 14:18:21.496123 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.496104849 +0000 UTC m=+113.599506338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:21.497716 master-0 kubenswrapper[4171]: E0223 14:18:21.496499 4171 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Feb 23 14:18:21.497716 master-0 kubenswrapper[4171]: E0223 14:18:21.496582 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.496559988 +0000 UTC m=+113.599961567 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.497234 4171 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.501319 4171 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.501332 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.501303506 +0000 UTC m=+113.604705025 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.501363 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.501351197 +0000 UTC m=+113.604752686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: I0223 14:18:21.501115 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: I0223 14:18:21.508632 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.508753 4171 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.508811 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.508794096 +0000 UTC m=+113.612195665 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: I0223 14:18:21.508869 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: I0223 14:18:21.508930 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.509092 4171 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.509123 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.509113142 +0000 UTC m=+113.612514711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.509170 4171 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:21.524503 master-0 kubenswrapper[4171]: E0223 14:18:21.509194 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.509184754 +0000 UTC m=+113.612586353 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found
Feb 23 14:18:21.610408 master-0 kubenswrapper[4171]: I0223 14:18:21.610332 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:21.612378 master-0 kubenswrapper[4171]: E0223 14:18:21.612336 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Feb 23 14:18:21.612455 master-0 kubenswrapper[4171]: E0223 14:18:21.612447 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:23.612424604 +0000 UTC m=+113.715826093 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found
Feb 23 14:18:22.814202 master-0 kubenswrapper[4171]: I0223 14:18:22.814141 4171 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="881e3b61730f49e9657641d193738c054ca1938ca39d0f830ceee7b02b6b1f78" exitCode=0
Feb 23 14:18:22.814873 master-0 kubenswrapper[4171]: I0223 14:18:22.814208 4171 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerDied","Data":"881e3b61730f49e9657641d193738c054ca1938ca39d0f830ceee7b02b6b1f78"}
Feb 23 14:18:23.532694 master-0 kubenswrapper[4171]: I0223 14:18:23.532620 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:23.532694 master-0 kubenswrapper[4171]: I0223 14:18:23.532675 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:23.532977 master-0 kubenswrapper[4171]: E0223 14:18:23.532870 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:23.532977 master-0 kubenswrapper[4171]: I0223 14:18:23.532899 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:23.533070 master-0 kubenswrapper[4171]: E0223 14:18:23.532992 4171 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Feb 23 14:18:23.533070 master-0 kubenswrapper[4171]: E0223 14:18:23.533005 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:27.53297498 +0000 UTC m=+117.636376469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:23.533070 master-0 kubenswrapper[4171]: E0223 14:18:23.533047 4171 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Feb 23 14:18:23.533070 master-0 kubenswrapper[4171]: E0223 14:18:23.533053 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:27.533035971 +0000 UTC m=+117.636437450 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found
Feb 23 14:18:23.533241 master-0 kubenswrapper[4171]: E0223 14:18:23.533093 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:27.533078632 +0000 UTC m=+117.636480301 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found Feb 23 14:18:23.533241 master-0 kubenswrapper[4171]: I0223 14:18:23.533093 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:23.533241 master-0 kubenswrapper[4171]: I0223 14:18:23.533141 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:23.533366 master-0 kubenswrapper[4171]: I0223 14:18:23.533346 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:23.533410 master-0 kubenswrapper[4171]: E0223 14:18:23.533343 4171 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:23.533459 master-0 kubenswrapper[4171]: E0223 14:18:23.533416 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls 
podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:27.533406128 +0000 UTC m=+117.636807617 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found Feb 23 14:18:23.533459 master-0 kubenswrapper[4171]: E0223 14:18:23.533429 4171 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Feb 23 14:18:23.533459 master-0 kubenswrapper[4171]: E0223 14:18:23.533433 4171 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:23.533611 master-0 kubenswrapper[4171]: E0223 14:18:23.533501 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:27.533468379 +0000 UTC m=+117.636870088 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found Feb 23 14:18:23.533611 master-0 kubenswrapper[4171]: I0223 14:18:23.533532 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:23.533611 master-0 kubenswrapper[4171]: E0223 14:18:23.533558 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:27.53353231 +0000 UTC m=+117.636933999 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found Feb 23 14:18:23.533611 master-0 kubenswrapper[4171]: E0223 14:18:23.533612 4171 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:23.533784 master-0 kubenswrapper[4171]: E0223 14:18:23.533654 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. 
No retries permitted until 2026-02-23 14:18:27.533643592 +0000 UTC m=+117.637045291 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:23.634979 master-0 kubenswrapper[4171]: I0223 14:18:23.634937 4171 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:23.635189 master-0 kubenswrapper[4171]: E0223 14:18:23.635120 4171 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Feb 23 14:18:23.635242 master-0 kubenswrapper[4171]: E0223 14:18:23.635220 4171 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:27.635201751 +0000 UTC m=+117.738603240 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found Feb 23 14:18:24.072647 master-0 kubenswrapper[4171]: I0223 14:18:24.072604 4171 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Feb 23 14:18:26.907314 master-0 systemd[1]: Stopping Kubernetes Kubelet... Feb 23 14:18:26.934422 master-0 systemd[1]: kubelet.service: Deactivated successfully. Feb 23 14:18:26.934781 master-0 systemd[1]: Stopped Kubernetes Kubelet. Feb 23 14:18:26.938651 master-0 systemd[1]: kubelet.service: Consumed 10.655s CPU time. Feb 23 14:18:26.973722 master-0 systemd[1]: Starting Kubernetes Kubelet... Feb 23 14:18:27.082453 master-0 kubenswrapper[7728]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 14:18:27.082453 master-0 kubenswrapper[7728]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 23 14:18:27.082453 master-0 kubenswrapper[7728]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 14:18:27.083572 master-0 kubenswrapper[7728]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 23 14:18:27.083572 master-0 kubenswrapper[7728]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 23 14:18:27.083572 master-0 kubenswrapper[7728]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 14:18:27.083572 master-0 kubenswrapper[7728]: I0223 14:18:27.082567 7728 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 23 14:18:27.088556 master-0 kubenswrapper[7728]: W0223 14:18:27.088452 7728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 14:18:27.088556 master-0 kubenswrapper[7728]: W0223 14:18:27.088550 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 23 14:18:27.088556 master-0 kubenswrapper[7728]: W0223 14:18:27.088556 7728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 14:18:27.088556 master-0 kubenswrapper[7728]: W0223 14:18:27.088561 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088565 7728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088572 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088576 7728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088580 7728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088583 7728 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088588 7728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088594 7728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088599 7728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088604 7728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088608 7728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088612 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088617 7728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088620 7728 feature_gate.go:330] unrecognized feature gate: Example Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088624 7728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088628 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088631 7728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088635 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088638 7728 feature_gate.go:330] 
unrecognized feature gate: AutomatedEtcdBackup Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088642 7728 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 23 14:18:27.088690 master-0 kubenswrapper[7728]: W0223 14:18:27.088646 7728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088649 7728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088654 7728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088658 7728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088662 7728 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088666 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088669 7728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088673 7728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088677 7728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088686 7728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088690 7728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088694 7728 feature_gate.go:330] unrecognized feature gate: 
PinnedImages Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088699 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088704 7728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088708 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088713 7728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088717 7728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088764 7728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088770 7728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 23 14:18:27.089161 master-0 kubenswrapper[7728]: W0223 14:18:27.088775 7728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088779 7728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088783 7728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088788 7728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088793 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088798 7728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088802 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088806 7728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088810 7728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088814 7728 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088819 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088823 7728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088827 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088832 7728 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088836 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088844 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088848 7728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088852 7728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088855 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 14:18:27.089602 master-0 kubenswrapper[7728]: W0223 14:18:27.088859 7728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088863 7728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088872 7728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088876 7728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088880 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088884 7728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088887 7728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088891 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 14:18:27.090055 master-0 
kubenswrapper[7728]: W0223 14:18:27.088894 7728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088898 7728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: W0223 14:18:27.088902 7728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089015 7728 flags.go:64] FLAG: --address="0.0.0.0" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089026 7728 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089037 7728 flags.go:64] FLAG: --anonymous-auth="true" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089063 7728 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089069 7728 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089074 7728 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089080 7728 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089086 7728 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089090 7728 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089095 7728 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 14:18:27.089101 7728 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 23 14:18:27.090055 master-0 kubenswrapper[7728]: I0223 
14:18:27.089106 7728 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089111 7728 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089115 7728 flags.go:64] FLAG: --cgroup-root="" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089120 7728 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089125 7728 flags.go:64] FLAG: --client-ca-file="" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089129 7728 flags.go:64] FLAG: --cloud-config="" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089134 7728 flags.go:64] FLAG: --cloud-provider="" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089138 7728 flags.go:64] FLAG: --cluster-dns="[]" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089143 7728 flags.go:64] FLAG: --cluster-domain="" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089148 7728 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089153 7728 flags.go:64] FLAG: --config-dir="" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089157 7728 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089162 7728 flags.go:64] FLAG: --container-log-max-files="5" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089169 7728 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089173 7728 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089178 7728 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" 
Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089183 7728 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089189 7728 flags.go:64] FLAG: --contention-profiling="false" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089193 7728 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089197 7728 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089201 7728 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089206 7728 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089211 7728 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089215 7728 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089219 7728 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 23 14:18:27.090579 master-0 kubenswrapper[7728]: I0223 14:18:27.089224 7728 flags.go:64] FLAG: --enable-load-reader="false" Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089228 7728 flags.go:64] FLAG: --enable-server="true" Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089233 7728 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089240 7728 flags.go:64] FLAG: --event-burst="100" Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089245 7728 flags.go:64] FLAG: --event-qps="50" Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089249 7728 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 
14:18:27.089253 7728 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089258 7728 flags.go:64] FLAG: --eviction-hard=""
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089263 7728 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089267 7728 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089272 7728 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089277 7728 flags.go:64] FLAG: --eviction-soft=""
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089282 7728 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089288 7728 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089292 7728 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089296 7728 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089300 7728 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089304 7728 flags.go:64] FLAG: --fail-swap-on="true"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089308 7728 flags.go:64] FLAG: --feature-gates=""
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089314 7728 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089318 7728 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089322 7728 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089327 7728 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089331 7728 flags.go:64] FLAG: --healthz-port="10248"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089335 7728 flags.go:64] FLAG: --help="false"
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089340 7728 flags.go:64] FLAG: --hostname-override=""
Feb 23 14:18:27.091168 master-0 kubenswrapper[7728]: I0223 14:18:27.089344 7728 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089348 7728 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089352 7728 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089356 7728 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089361 7728 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089365 7728 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089370 7728 flags.go:64] FLAG: --image-service-endpoint=""
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089375 7728 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089379 7728 flags.go:64] FLAG: --kube-api-burst="100"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089384 7728 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089389 7728 flags.go:64] FLAG: --kube-api-qps="50"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089394 7728 flags.go:64] FLAG: --kube-reserved=""
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089398 7728 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089403 7728 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089407 7728 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089411 7728 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089416 7728 flags.go:64] FLAG: --lock-file=""
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089421 7728 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089427 7728 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089432 7728 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089443 7728 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089448 7728 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089452 7728 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089457 7728 flags.go:64] FLAG: --logging-format="text"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089461 7728 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 23 14:18:27.091861 master-0 kubenswrapper[7728]: I0223 14:18:27.089466 7728 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089470 7728 flags.go:64] FLAG: --manifest-url=""
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089491 7728 flags.go:64] FLAG: --manifest-url-header=""
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089497 7728 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089502 7728 flags.go:64] FLAG: --max-open-files="1000000"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089507 7728 flags.go:64] FLAG: --max-pods="110"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089511 7728 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089515 7728 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089520 7728 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089524 7728 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089529 7728 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089533 7728 flags.go:64] FLAG: --node-ip="192.168.32.10"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089538 7728 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089549 7728 flags.go:64] FLAG: --node-status-max-images="50"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089553 7728 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089558 7728 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089562 7728 flags.go:64] FLAG: --pod-cidr=""
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089566 7728 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5001a555eb05eef7f23d64667303c2b4db8343ee900c265f7613c40c1db229"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089575 7728 flags.go:64] FLAG: --pod-manifest-path=""
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089580 7728 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089585 7728 flags.go:64] FLAG: --pods-per-core="0"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089589 7728 flags.go:64] FLAG: --port="10250"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089594 7728 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089598 7728 flags.go:64] FLAG: --provider-id=""
Feb 23 14:18:27.092440 master-0 kubenswrapper[7728]: I0223 14:18:27.089602 7728 flags.go:64] FLAG: --qos-reserved=""
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089609 7728 flags.go:64] FLAG: --read-only-port="10255"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089614 7728 flags.go:64] FLAG: --register-node="true"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089619 7728 flags.go:64] FLAG: --register-schedulable="true"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089623 7728 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089632 7728 flags.go:64] FLAG: --registry-burst="10"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089636 7728 flags.go:64] FLAG: --registry-qps="5"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089641 7728 flags.go:64] FLAG: --reserved-cpus=""
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089646 7728 flags.go:64] FLAG: --reserved-memory=""
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089652 7728 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089657 7728 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089661 7728 flags.go:64] FLAG: --rotate-certificates="false"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089665 7728 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089669 7728 flags.go:64] FLAG: --runonce="false"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089674 7728 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089724 7728 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089730 7728 flags.go:64] FLAG: --seccomp-default="false"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089734 7728 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089738 7728 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089743 7728 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089747 7728 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089752 7728 flags.go:64] FLAG: --storage-driver-password="root"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089756 7728 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089760 7728 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089764 7728 flags.go:64] FLAG: --storage-driver-user="root"
Feb 23 14:18:27.093183 master-0 kubenswrapper[7728]: I0223 14:18:27.089769 7728 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089773 7728 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089777 7728 flags.go:64] FLAG: --system-cgroups=""
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089782 7728 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089789 7728 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089793 7728 flags.go:64] FLAG: --tls-cert-file=""
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089797 7728 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089803 7728 flags.go:64] FLAG: --tls-min-version=""
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089807 7728 flags.go:64] FLAG: --tls-private-key-file=""
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089812 7728 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089816 7728 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089820 7728 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089825 7728 flags.go:64] FLAG: --v="2"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089831 7728 flags.go:64] FLAG: --version="false"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089837 7728 flags.go:64] FLAG: --vmodule=""
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089842 7728 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: I0223 14:18:27.089847 7728 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090050 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090056 7728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090061 7728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090065 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090070 7728 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090074 7728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090078 7728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:18:27.093767 master-0 kubenswrapper[7728]: W0223 14:18:27.090082 7728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090085 7728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090090 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090093 7728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090097 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090100 7728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090104 7728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090108 7728 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090111 7728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090114 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090118 7728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090121 7728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090125 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090129 7728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090133 7728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090137 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090141 7728 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090144 7728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090148 7728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:18:27.094547 master-0 kubenswrapper[7728]: W0223 14:18:27.090152 7728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090155 7728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090159 7728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090162 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090166 7728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090171 7728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090176 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090180 7728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090186 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090191 7728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090196 7728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090201 7728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090205 7728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090209 7728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090212 7728 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090216 7728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090219 7728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090222 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090267 7728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:18:27.094993 master-0 kubenswrapper[7728]: W0223 14:18:27.090271 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090274 7728 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090278 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090281 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090285 7728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090288 7728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090292 7728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090295 7728 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090299 7728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090302 7728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090306 7728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090310 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090313 7728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090318 7728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090323 7728 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090327 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090331 7728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090335 7728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090338 7728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090342 7728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:18:27.095454 master-0 kubenswrapper[7728]: W0223 14:18:27.090346 7728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:18:27.096116 master-0 kubenswrapper[7728]: W0223 14:18:27.090351 7728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:18:27.096116 master-0 kubenswrapper[7728]: W0223 14:18:27.090355 7728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:18:27.096116 master-0 kubenswrapper[7728]: W0223 14:18:27.090365 7728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:18:27.096116 master-0 kubenswrapper[7728]: W0223 14:18:27.090368 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:18:27.096116 master-0 kubenswrapper[7728]: W0223 14:18:27.090372 7728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:18:27.096116 master-0 kubenswrapper[7728]: W0223 14:18:27.090375 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:18:27.096116 master-0 kubenswrapper[7728]: I0223 14:18:27.090432 7728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 14:18:27.100742 master-0 kubenswrapper[7728]: I0223 14:18:27.100172 7728 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Feb 23 14:18:27.100742 master-0 kubenswrapper[7728]: I0223 14:18:27.100737 7728 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 23 14:18:27.100825 master-0 kubenswrapper[7728]: W0223 14:18:27.100810 7728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:18:27.100825 master-0 kubenswrapper[7728]: W0223 14:18:27.100817 7728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 14:18:27.100825 master-0 kubenswrapper[7728]: W0223 14:18:27.100821 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:18:27.100825 master-0 kubenswrapper[7728]: W0223 14:18:27.100826 7728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100831 7728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100837 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100842 7728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100845 7728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100849 7728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100854 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100858 7728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100861 7728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100865 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100870 7728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100878 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100881 7728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100885 7728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100889 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100892 7728 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100897 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100900 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100904 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100907 7728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:18:27.100931 master-0 kubenswrapper[7728]: W0223 14:18:27.100911 7728 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100915 7728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100919 7728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100923 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100926 7728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100930 7728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100935 7728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100940 7728 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100945 7728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100949 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100953 7728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100958 7728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100963 7728 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100966 7728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.100970 7728 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.101015 7728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.101020 7728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.101024 7728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.101028 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.101032 7728 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:18:27.101386 master-0 kubenswrapper[7728]: W0223 14:18:27.101036 7728 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101039 7728 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101043 7728 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101047 7728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101051 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101056 7728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101060 7728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101064 7728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101068 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101073 7728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101077 7728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101081 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101086 7728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101090 7728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101093 7728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101097 7728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101101 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101106 7728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101109 7728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:18:27.101920 master-0 kubenswrapper[7728]: W0223 14:18:27.101114 7728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101119 7728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101124 7728 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101127 7728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101131 7728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101135 7728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101139 7728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101142 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101147 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101151 7728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: I0223 14:18:27.101158 7728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101279 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101287 7728 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101292 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101296 7728 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:18:27.102409 master-0 kubenswrapper[7728]: W0223 14:18:27.101299 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101303 7728 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101307 7728 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101310 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101314 7728 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101318 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101324 7728 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101328 7728 feature_gate.go:330]
unrecognized feature gate: EtcdBackendQuota Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101332 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101336 7728 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101339 7728 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101343 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101347 7728 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101351 7728 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101355 7728 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101360 7728 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101363 7728 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101367 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101371 7728 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 14:18:27.102973 master-0 kubenswrapper[7728]: W0223 14:18:27.101374 7728 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101379 7728 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101383 7728 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101387 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101391 7728 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101396 7728 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101400 7728 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101403 7728 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101407 7728 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 23 14:18:27.103656 
master-0 kubenswrapper[7728]: W0223 14:18:27.101411 7728 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101415 7728 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101419 7728 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101422 7728 feature_gate.go:330] unrecognized feature gate: Example Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101425 7728 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101429 7728 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101432 7728 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101436 7728 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101440 7728 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101443 7728 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101449 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 14:18:27.103656 master-0 kubenswrapper[7728]: W0223 14:18:27.101452 7728 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101457 7728 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101461 7728 
feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101465 7728 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101468 7728 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101472 7728 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101487 7728 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101491 7728 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101495 7728 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101500 7728 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101504 7728 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101508 7728 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101512 7728 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101516 7728 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101519 7728 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101523 7728 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101526 7728 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101530 7728 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101533 7728 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101537 7728 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 14:18:27.104172 master-0 kubenswrapper[7728]: W0223 14:18:27.101541 7728 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101547 7728 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101551 7728 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101555 7728 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101560 7728 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101564 7728 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101570 7728 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101574 7728 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: W0223 14:18:27.101580 7728 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: I0223 14:18:27.101587 7728 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: I0223 14:18:27.101852 7728 server.go:940] "Client rotation is on, will bootstrap in background" Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: I0223 14:18:27.104250 7728 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 23 14:18:27.104770 master-0 kubenswrapper[7728]: I0223 14:18:27.104407 7728 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 23 14:18:27.105067 master-0 kubenswrapper[7728]: I0223 14:18:27.104830 7728 server.go:997] "Starting client certificate rotation" Feb 23 14:18:27.105067 master-0 kubenswrapper[7728]: I0223 14:18:27.104864 7728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 23 14:18:27.105256 master-0 kubenswrapper[7728]: I0223 14:18:27.105053 7728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 14:08:38 +0000 UTC, rotation deadline is 2026-02-24 08:10:33.796496937 +0000 UTC Feb 23 14:18:27.105256 master-0 kubenswrapper[7728]: I0223 14:18:27.105246 7728 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h52m6.691255282s for next certificate rotation Feb 23 14:18:27.107176 master-0 kubenswrapper[7728]: I0223 14:18:27.107145 7728 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 14:18:27.109784 master-0 kubenswrapper[7728]: I0223 14:18:27.109738 7728 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 14:18:27.114020 master-0 kubenswrapper[7728]: I0223 14:18:27.113991 7728 log.go:25] "Validated CRI v1 runtime API" Feb 23 14:18:27.116986 master-0 kubenswrapper[7728]: I0223 14:18:27.116953 7728 log.go:25] "Validated CRI v1 image API" Feb 23 14:18:27.118583 master-0 kubenswrapper[7728]: I0223 14:18:27.118559 7728 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 23 14:18:27.121773 master-0 kubenswrapper[7728]: I0223 14:18:27.121740 7728 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 be859281-f98a-48e6-a6b4-cc97afbc917c:/dev/vda3] Feb 23 14:18:27.122017 master-0 kubenswrapper[7728]: I0223 14:18:27.121766 7728 fs.go:136] Filesystem 
partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98/userdata/shm major:0 minor:295 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a/userdata/shm major:0 minor:285 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6b4c5917f42a018e656736ffe0ec509b45d342d70ccb1039a1f41866022cf32e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6b4c5917f42a018e656736ffe0ec509b45d342d70ccb1039a1f41866022cf32e/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe/userdata/shm major:0 minor:77 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca/userdata/shm major:0 minor:282 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f/userdata/shm major:0 minor:124 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274/userdata/shm major:0 minor:144 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff/userdata/shm major:0 minor:168 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f/userdata/shm major:0 minor:264 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042/userdata/shm major:0 minor:301 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc/userdata/shm major:0 minor:279 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e/userdata/shm major:0 minor:145 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610/userdata/shm major:0 minor:131 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60/userdata/shm major:0 minor:299 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/08c561b3-613b-425f-9de4-d5fc8762ea51/volumes/kubernetes.io~projected/kube-api-access-phmkf:{mountpoint:/var/lib/kubelet/pods/08c561b3-613b-425f-9de4-d5fc8762ea51/volumes/kubernetes.io~projected/kube-api-access-phmkf major:0 minor:277 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/09d80e28-0b64-4c5d-a9bc-99d843d40165/volumes/kubernetes.io~projected/kube-api-access-g9z2f:{mountpoint:/var/lib/kubelet/pods/09d80e28-0b64-4c5d-a9bc-99d843d40165/volumes/kubernetes.io~projected/kube-api-access-g9z2f major:0 minor:113 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~projected/kube-api-access-7hp42:{mountpoint:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~projected/kube-api-access-7hp42 major:0 minor:272 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~secret/serving-cert major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/kube-api-access-6qszm:{mountpoint:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/kube-api-access-6qszm major:0 minor:258 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3cea0ab8-258b-486c-bb7f-8c93930b296d/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/3cea0ab8-258b-486c-bb7f-8c93930b296d/volumes/kubernetes.io~projected/kube-api-access major:0 minor:110 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/483786a0-0a29-44bf-bbd0-2f37e045aa2c/volumes/kubernetes.io~projected/kube-api-access-88qnh:{mountpoint:/var/lib/kubelet/pods/483786a0-0a29-44bf-bbd0-2f37e045aa2c/volumes/kubernetes.io~projected/kube-api-access-88qnh major:0 minor:130 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~projected/kube-api-access-qggzs:{mountpoint:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~projected/kube-api-access-qggzs major:0 minor:259 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~projected/kube-api-access-q9tkx:{mountpoint:/var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~projected/kube-api-access-q9tkx major:0 minor:267 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~projected/kube-api-access-d5f8j:{mountpoint:/var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~projected/kube-api-access-d5f8j major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~projected/kube-api-access-v4sbp:{mountpoint:/var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~projected/kube-api-access-v4sbp major:0 minor:284 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~projected/kube-api-access-2jzsd:{mountpoint:/var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~projected/kube-api-access-2jzsd major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~projected/kube-api-access-brd4j:{mountpoint:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~projected/kube-api-access-brd4j major:0 minor:73 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/842d45c5-3452-4e97-b5f5-540395330a65/volumes/kubernetes.io~projected/kube-api-access-j54j5:{mountpoint:/var/lib/kubelet/pods/842d45c5-3452-4e97-b5f5-540395330a65/volumes/kubernetes.io~projected/kube-api-access-j54j5 major:0 minor:281 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~projected/kube-api-access-tr2p2:{mountpoint:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~projected/kube-api-access-tr2p2 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~secret/serving-cert major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~projected/kube-api-access-x7jvd:{mountpoint:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~projected/kube-api-access-x7jvd major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/etcd-client major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/serving-cert major:0 minor:247 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~projected/kube-api-access major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~secret/serving-cert major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a4ae9292-71dc-4484-b277-43cb26c1e04d/volumes/kubernetes.io~projected/kube-api-access-8llc8:{mountpoint:/var/lib/kubelet/pods/a4ae9292-71dc-4484-b277-43cb26c1e04d/volumes/kubernetes.io~projected/kube-api-access-8llc8 major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~projected/kube-api-access-wzkcs:{mountpoint:/var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~projected/kube-api-access-wzkcs major:0 minor:135 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~projected/kube-api-access-fjs6f:{mountpoint:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~projected/kube-api-access-fjs6f major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:140 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~projected/kube-api-access-lr868:{mountpoint:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~projected/kube-api-access-lr868 major:0 minor:256 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~secret/serving-cert major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~projected/kube-api-access major:0 minor:271 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~secret/serving-cert major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~projected/kube-api-access-kp5kb:{mountpoint:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~projected/kube-api-access-kp5kb major:0 minor:167 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~secret/webhook-cert major:0 minor:166 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~projected/kube-api-access-chznd:{mountpoint:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~projected/kube-api-access-chznd major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~secret/serving-cert major:0 minor:252 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~projected/kube-api-access major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~secret/serving-cert major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~projected/kube-api-access-vp6tj:{mountpoint:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~projected/kube-api-access-vp6tj major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~projected/kube-api-access-4nr85:{mountpoint:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~projected/kube-api-access-4nr85 major:0 minor:278 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~secret/serving-cert major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~projected/kube-api-access-269v7:{mountpoint:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~projected/kube-api-access-269v7 major:0 minor:143 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:142 fsType:tmpfs blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/3e23d738fd862891b3c5c6ee84f91a0ce1daa9cf5128e482a3f4b8954a4bf8d3/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/6a9d7144afab304634f4421dbd76371440974bc716c0b26cd01f589c0034a15c/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/aa8bfffbe1cc23950a48e86d7438fcb09c7e8b9c4f14a3ca45177c89e18dfa48/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/fbf0b4592cee0fa136a06e3b9c1a66152a8b40356d1fb53bc290b6031dc549ba/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-118:{mountpoint:/var/lib/containers/storage/overlay/a1aa23976f0a98ed5e50531694375ca3b171e1566104ad25da2d5f443d50d8d4/merged major:0 minor:118 fsType:overlay blockSize:0} overlay_0-126:{mountpoint:/var/lib/containers/storage/overlay/65ee14cad8f8e89d1b904e383e96e66f946404750a35c14c963851861e9757a0/merged major:0 minor:126 fsType:overlay blockSize:0} overlay_0-128:{mountpoint:/var/lib/containers/storage/overlay/aef8feb430ce5c615cb7ec557a03ee87c7e4a9488a29525a55eafa389c7dff44/merged major:0 minor:128 fsType:overlay blockSize:0} overlay_0-133:{mountpoint:/var/lib/containers/storage/overlay/f36e77d1a7f8bcfa0abd0da89b68bc1acea7f4ab44b36a59515228e130593ab3/merged major:0 minor:133 fsType:overlay blockSize:0} 
overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/cab0cf6a3545e3e03202870da16a0f9b15304a7d67884a95e0603397075cf891/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/6f8d98512dccef2bf36b0e248f554e9ff2c819756dda85ebe869a85bb6873172/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-148:{mountpoint:/var/lib/containers/storage/overlay/4dafe75899de7a156b4aa675f65ba30667db6c91576d7feb74f9a5c7dda5586a/merged major:0 minor:148 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/d3cc6d80825cea1b3b543f8f76c93eaad81913823bc8b5613f1475c3af3b9369/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/41daa65fd8ca427bc31044355e67d5fbecef2836af01f5ca37cf0caf5ee25ec4/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/3333daf134cb3da0245069a02c7243182aedd8032f3702a16979a442df0d8ec1/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-159:{mountpoint:/var/lib/containers/storage/overlay/4d25ac30bfc8b0dc4a9dee0884be64c59b3b4a06b12fc37141083d69359d3c63/merged major:0 minor:159 fsType:overlay blockSize:0} overlay_0-164:{mountpoint:/var/lib/containers/storage/overlay/416fdbd7f81cdc209c426295b456e9f3b325016c99af172a0cdb54ee1b720108/merged major:0 minor:164 fsType:overlay blockSize:0} overlay_0-170:{mountpoint:/var/lib/containers/storage/overlay/38f99c8ad9bd37b9b4302ffff5c5e979340a349394f8d63ad9affee8a695b213/merged major:0 minor:170 fsType:overlay blockSize:0} overlay_0-172:{mountpoint:/var/lib/containers/storage/overlay/6d458c784040933dc173e5962450a973b8341f582593afa9496558be7640fe0e/merged major:0 minor:172 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/450c703987cbb7e8e6d3ebe4ebb96d3e61162f0469e9f739001f2a8881f943e7/merged major:0 minor:174 fsType:overlay blockSize:0} 
overlay_0-176:{mountpoint:/var/lib/containers/storage/overlay/c36ac0523fb77eb41a58606dc01d214baf8615e5bbeb8ff440c70ab25e948a9b/merged major:0 minor:176 fsType:overlay blockSize:0} overlay_0-178:{mountpoint:/var/lib/containers/storage/overlay/cee62bc9f0ca9757435066cbff749f999feef7c526f307853875114308836b79/merged major:0 minor:178 fsType:overlay blockSize:0} overlay_0-180:{mountpoint:/var/lib/containers/storage/overlay/f3a06b6bc5cc9309a5e816dc7e624e48b2fd95e158eaf6ec7a977ce6ed11e8f4/merged major:0 minor:180 fsType:overlay blockSize:0} overlay_0-182:{mountpoint:/var/lib/containers/storage/overlay/1e3ee3b4e61b65508bb9a9c3ce4e93682fca7288ae8c4afe90b5241b15eeb3ac/merged major:0 minor:182 fsType:overlay blockSize:0} overlay_0-190:{mountpoint:/var/lib/containers/storage/overlay/58f913d8907dd224ec3b885692f70c1043484d915de9010496ea8ceddd1aa889/merged major:0 minor:190 fsType:overlay blockSize:0} overlay_0-192:{mountpoint:/var/lib/containers/storage/overlay/51ddc9b6e59ecc9f855b0bb8fd2248ea06dad26175bb39cd321e6c96e8982f82/merged major:0 minor:192 fsType:overlay blockSize:0} overlay_0-200:{mountpoint:/var/lib/containers/storage/overlay/da967a919abebb50750e8c0d85bef2750865d0726afd702b1d1a3f17dfe53bad/merged major:0 minor:200 fsType:overlay blockSize:0} overlay_0-205:{mountpoint:/var/lib/containers/storage/overlay/1a68b7abdc1772ea02040d6b0540fe9d6e306c2754d5742bf76423df09a9a7b8/merged major:0 minor:205 fsType:overlay blockSize:0} overlay_0-210:{mountpoint:/var/lib/containers/storage/overlay/245e9b1536351d0c345cb010b3854cf958019e458799ecda785c2e646b1533aa/merged major:0 minor:210 fsType:overlay blockSize:0} overlay_0-215:{mountpoint:/var/lib/containers/storage/overlay/c0e9ed247f3ae29cce83d4673289a9d6f58da74c2a0ac713b5699a6fc3d6f605/merged major:0 minor:215 fsType:overlay blockSize:0} overlay_0-220:{mountpoint:/var/lib/containers/storage/overlay/5fee5e65188a47c8fb1d51bad993bbec9a22552dcb8f6652e23cea5a06aef08c/merged major:0 minor:220 fsType:overlay blockSize:0} 
overlay_0-221:{mountpoint:/var/lib/containers/storage/overlay/a5c42f1605b03d1d312007f4db64dc34a7c6cd9b8b8616276f3c0b24e8b7e650/merged major:0 minor:221 fsType:overlay blockSize:0} overlay_0-230:{mountpoint:/var/lib/containers/storage/overlay/6938a7f15f3e6a2e952b5dd5e28c577ac4d16eadc2b89da7147e74db80f20045/merged major:0 minor:230 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/fd45fad339d4e6dd1831b338f9a235c704498ed73e806313f3f04902ebd9428f/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/59e96ceb691c7ea249ed2a793caeb2025d0fe992eaa028ba9c99c2b376fd863d/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/de7124b09be7c8692a63b3d1a8ea9f9f83da6a0eee30081f57e20a98470ba8c8/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/947b9b54a7154bfe64998abbd37772745f37b596f118913ecd5b817e6399e08d/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/3120d70b85fb10291a840bc6a1b4850fcf98f369a88a30a9cf6401424a58a8e5/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/02e6bec4520bbf2009bcda686373feb51de4291850d5a3614e322a9b5c365b91/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/91baa4597c1a7f1c159d6bb3eb6edd6cb43dfe7701c56233800fe59c612ecca5/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-307:{mountpoint:/var/lib/containers/storage/overlay/e9e10c6da5a947cddda5a879cc8c50f82fd9899a38d2585752e58829688114de/merged major:0 minor:307 fsType:overlay blockSize:0} overlay_0-309:{mountpoint:/var/lib/containers/storage/overlay/12bbc7c46d95b08f346a3f5f71c8a0540090dc539a1b2c8b5350a14f78c46334/merged major:0 minor:309 fsType:overlay blockSize:0} 
overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/e0bb40a25462827249a9ec7a4d574e757daecd055c7ca8c0b02fabf14528f9ed/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/6329e16ae625fb7e5314e841d6aba1be9fe513780d9fdafa4adc9eee32be69ed/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-315:{mountpoint:/var/lib/containers/storage/overlay/8aa793db136d69396e48e1178df52c10748763169e9d74c379a148935c773ce4/merged major:0 minor:315 fsType:overlay blockSize:0} overlay_0-317:{mountpoint:/var/lib/containers/storage/overlay/1009f928e04db67d5b9bce4f0015c9968bf4372829e27df4fb8f204db02cefbf/merged major:0 minor:317 fsType:overlay blockSize:0} overlay_0-319:{mountpoint:/var/lib/containers/storage/overlay/57c349b0dbe7e8c2c5decc44840351a66d8b39475a4814668d89f8d47a3934a2/merged major:0 minor:319 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/8fb4fdf0e668f6f38a1acd96e143707d856196b9145f68615cb55b06fdd730cf/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/48c01a72e18dd97c4d0b216b631528a3cd619fde4251bc708cacac3ff227dd4e/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/3fbe469d46126a5a22a28864d2d45c3e798b77aa9755af854ec32de7ba3755e4/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/4022e1e0f28afc0ea00cfc33758e87d910e1b3ea281bb82c1d936bc24516b076/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/476e0dd1c67d95cd8931cd33fd0cc8afd0a94d532560d2489594707a49535a77/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/4c8ad3a8be25193d2ab53ea0def8c657bf8fd0e3774f7aa2875e80d5241b3549/merged major:0 minor:62 fsType:overlay blockSize:0} 
overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/b32dcbadef3e40e3021ff7b2b77b0d81ca02bc6b4ed3579b4ff5ae4d3bf7206d/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/8e4ea7dcd466d47ff9fb8782263f5797a87f12d02a6690edb7ba6bd4e071b1e3/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/22195f84b8b2a2fd509a7fe1c82467b03d647c1a5d825126523241085b2229cc/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/029311ebd616972c426559e6755d65c760e54dc38f368fe4e4be6d9c920c3816/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-80:{mountpoint:/var/lib/containers/storage/overlay/82ab4451e35e9a68cf12b0b304e32274205edd9ccdfec8c0f6378e58164fbbe3/merged major:0 minor:80 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/9c5a07ef6357864a976e979a79eb33b087aedb7ca61e4bd842040ad0b65cfa4c/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/11d70b8baed9d344523383b57bd6da0f6705babfd128e763946bf4fe59f5d386/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/ea029ff302f6bf96011e43532c5f57da64e0411f047fc3e3368f1106fb248fe5/merged major:0 minor:92 fsType:overlay blockSize:0} overlay_0-94:{mountpoint:/var/lib/containers/storage/overlay/933512f39ed2d596e622ceb8d639da1abdf3cb6ca1055ffa05c6bcd37f642771/merged major:0 minor:94 fsType:overlay blockSize:0}] Feb 23 14:18:27.146588 master-0 kubenswrapper[7728]: I0223 14:18:27.145936 7728 manager.go:217] Machine: {Timestamp:2026-02-23 14:18:27.144984268 +0000 UTC m=+0.107645564 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2800000 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:b2003aa684e6437e87dd9193d3a162ac SystemUUID:b2003aa6-84e6-437e-87dd-9193d3a162ac BootID:f84e7e92-cd63-4a9e-83cc-11dcb3ddc406 Filesystems:[{Device:overlay_0-118 DeviceMajor:0 DeviceMinor:118 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-128 DeviceMajor:0 DeviceMinor:128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-176 DeviceMajor:0 DeviceMinor:176 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-182 DeviceMajor:0 DeviceMinor:182 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:140 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~projected/kube-api-access-chznd DeviceMajor:0 DeviceMinor:249 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e/userdata/shm DeviceMajor:0 DeviceMinor:145 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042/userdata/shm DeviceMajor:0 DeviceMinor:301 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/kube-api-access-6qszm DeviceMajor:0 DeviceMinor:258 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff/userdata/shm DeviceMajor:0 DeviceMinor:168 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-210 DeviceMajor:0 DeviceMinor:210 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:248 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:251 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:252 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~projected/kube-api-access-vp6tj DeviceMajor:0 DeviceMinor:257 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~projected/kube-api-access-7hp42 DeviceMajor:0 DeviceMinor:272 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-172 DeviceMajor:0 DeviceMinor:172 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-319 DeviceMajor:0 DeviceMinor:319 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~projected/kube-api-access-wzkcs DeviceMajor:0 DeviceMinor:135 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:166 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:235 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~projected/kube-api-access-d5f8j DeviceMajor:0 DeviceMinor:253 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335554048 Type:vfs 
Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:271 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-317 DeviceMajor:0 DeviceMinor:317 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-200 DeviceMajor:0 DeviceMinor:200 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc/userdata/shm DeviceMajor:0 DeviceMinor:279 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe/userdata/shm DeviceMajor:0 DeviceMinor:77 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-170 DeviceMajor:0 DeviceMinor:170 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~projected/kube-api-access-2jzsd DeviceMajor:0 DeviceMinor:255 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-126 DeviceMajor:0 DeviceMinor:126 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-205 DeviceMajor:0 DeviceMinor:205 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~projected/kube-api-access-tr2p2 DeviceMajor:0 DeviceMinor:239 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-215 DeviceMajor:0 DeviceMinor:215 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:243 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:247 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:263 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f/userdata/shm DeviceMajor:0 DeviceMinor:264 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f/userdata/shm DeviceMajor:0 DeviceMinor:124 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-230 DeviceMajor:0 DeviceMinor:230 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~projected/kube-api-access-q9tkx DeviceMajor:0 DeviceMinor:267 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-133 DeviceMajor:0 DeviceMinor:133 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:240 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~projected/kube-api-access-fjs6f DeviceMajor:0 DeviceMinor:141 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:142 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-315 DeviceMajor:0 DeviceMinor:315 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6b4c5917f42a018e656736ffe0ec509b45d342d70ccb1039a1f41866022cf32e/userdata/shm DeviceMajor:0 
DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98/userdata/shm DeviceMajor:0 DeviceMinor:295 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-309 DeviceMajor:0 DeviceMinor:309 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-148 DeviceMajor:0 DeviceMinor:148 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-221 DeviceMajor:0 DeviceMinor:221 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:246 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~projected/kube-api-access-4nr85 DeviceMajor:0 DeviceMinor:278 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-178 DeviceMajor:0 DeviceMinor:178 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~projected/kube-api-access-269v7 DeviceMajor:0 DeviceMinor:143 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a4ae9292-71dc-4484-b277-43cb26c1e04d/volumes/kubernetes.io~projected/kube-api-access-8llc8 DeviceMajor:0 DeviceMinor:250 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a/userdata/shm DeviceMajor:0 DeviceMinor:285 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-180 DeviceMajor:0 DeviceMinor:180 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:241 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/08c561b3-613b-425f-9de4-d5fc8762ea51/volumes/kubernetes.io~projected/kube-api-access-phmkf DeviceMajor:0 DeviceMinor:277 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-190 DeviceMajor:0 DeviceMinor:190 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:254 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/483786a0-0a29-44bf-bbd0-2f37e045aa2c/volumes/kubernetes.io~projected/kube-api-access-88qnh DeviceMajor:0 DeviceMinor:130 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-164 DeviceMajor:0 DeviceMinor:164 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~projected/kube-api-access-lr868 DeviceMajor:0 DeviceMinor:256 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-307 DeviceMajor:0 DeviceMinor:307 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610/userdata/shm DeviceMajor:0 DeviceMinor:131 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~projected/kube-api-access-x7jvd DeviceMajor:0 DeviceMinor:260 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-80 DeviceMajor:0 DeviceMinor:80 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274/userdata/shm DeviceMajor:0 DeviceMinor:144 
Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-192 DeviceMajor:0 DeviceMinor:192 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/842d45c5-3452-4e97-b5f5-540395330a65/volumes/kubernetes.io~projected/kube-api-access-j54j5 DeviceMajor:0 DeviceMinor:281 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~projected/kube-api-access-brd4j DeviceMajor:0 DeviceMinor:73 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:242 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3cea0ab8-258b-486c-bb7f-8c93930b296d/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:110 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-159 DeviceMajor:0 DeviceMinor:159 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-220 DeviceMajor:0 DeviceMinor:220 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~projected/kube-api-access-v4sbp DeviceMajor:0 DeviceMinor:284 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/09d80e28-0b64-4c5d-a9bc-99d843d40165/volumes/kubernetes.io~projected/kube-api-access-g9z2f DeviceMajor:0 DeviceMinor:113 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~projected/kube-api-access-kp5kb DeviceMajor:0 DeviceMinor:167 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:245 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~projected/kube-api-access-qggzs DeviceMajor:0 DeviceMinor:259 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60/userdata/shm DeviceMajor:0 DeviceMinor:299 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-94 DeviceMajor:0 DeviceMinor:94 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:244 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:4782d187d8efc0b MacAddress:ce:ae:c3:b0:18:fb Speed:10000 Mtu:8900} {Name:4b7d2c8100142f9 MacAddress:2e:c1:0b:ed:3e:17 Speed:10000 Mtu:8900} {Name:5110b129f87dd0c MacAddress:b2:76:37:37:d2:74 Speed:10000 Mtu:8900} {Name:6ffc0e356ee8d2e MacAddress:de:fe:8f:d5:58:a5 Speed:10000 Mtu:8900} {Name:7a9436157615441 MacAddress:6a:e1:08:2c:ce:37 Speed:10000 Mtu:8900} {Name:8bce00dde4bf57f MacAddress:8a:75:a3:7a:ce:97 Speed:10000 Mtu:8900} {Name:9c6c5f4b9ba45ac MacAddress:96:a4:fc:13:60:eb Speed:10000 Mtu:8900} {Name:a97726e86565351 MacAddress:0e:63:8f:1a:46:e4 Speed:10000 Mtu:8900} {Name:b0f1382249dc5f2 MacAddress:2e:93:ad:44:de:01 Speed:10000 Mtu:8900} {Name:bf4e70417a5730a MacAddress:8a:e7:fe:22:f6:77 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:92:5a:b3:60:03:d3 Speed:0 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:88:49:0a Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:4d:95:0b Speed:-1 Mtu:9000} {Name:fcec922662159dc MacAddress:96:31:3b:0f:d4:f3 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:4a:ca:1a:d6:6e:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] 
Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 23 14:18:27.146588 master-0 kubenswrapper[7728]: I0223 14:18:27.146562 7728 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. 
Perf event counters are not available. Feb 23 14:18:27.147274 master-0 kubenswrapper[7728]: I0223 14:18:27.146697 7728 manager.go:233] Version: {KernelVersion:5.14.0-427.109.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602022246-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 23 14:18:27.147274 master-0 kubenswrapper[7728]: I0223 14:18:27.147116 7728 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 23 14:18:27.147404 master-0 kubenswrapper[7728]: I0223 14:18:27.147316 7728 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 23 14:18:27.147663 master-0 kubenswrapper[7728]: I0223 14:18:27.147351 7728 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 23 14:18:27.147776 master-0 kubenswrapper[7728]: I0223 14:18:27.147676 7728 topology_manager.go:138] "Creating topology manager with none policy" Feb 23 14:18:27.147776 master-0 kubenswrapper[7728]: I0223 14:18:27.147690 7728 container_manager_linux.go:303] "Creating device plugin manager" Feb 23 14:18:27.147776 master-0 kubenswrapper[7728]: I0223 14:18:27.147702 7728 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 14:18:27.147776 master-0 kubenswrapper[7728]: I0223 14:18:27.147731 7728 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 14:18:27.147992 master-0 kubenswrapper[7728]: I0223 14:18:27.147870 7728 state_mem.go:36] "Initialized new in-memory state store" Feb 23 14:18:27.148286 master-0 kubenswrapper[7728]: I0223 14:18:27.148249 7728 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 23 14:18:27.148354 master-0 kubenswrapper[7728]: I0223 14:18:27.148321 7728 kubelet.go:418] "Attempting to sync node with API server" Feb 23 14:18:27.148354 master-0 kubenswrapper[7728]: I0223 14:18:27.148336 7728 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 23 14:18:27.148354 master-0 kubenswrapper[7728]: I0223 14:18:27.148352 7728 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 23 14:18:27.148559 master-0 kubenswrapper[7728]: I0223 14:18:27.148366 7728 kubelet.go:324] "Adding apiserver pod 
source" Feb 23 14:18:27.148559 master-0 kubenswrapper[7728]: I0223 14:18:27.148388 7728 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 23 14:18:27.149776 master-0 kubenswrapper[7728]: I0223 14:18:27.149736 7728 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-6.rhaos4.18.git7ed6156.el9" apiVersion="v1" Feb 23 14:18:27.150008 master-0 kubenswrapper[7728]: I0223 14:18:27.149969 7728 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Feb 23 14:18:27.152964 master-0 kubenswrapper[7728]: I0223 14:18:27.152908 7728 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 23 14:18:27.153209 master-0 kubenswrapper[7728]: I0223 14:18:27.153171 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 23 14:18:27.153209 master-0 kubenswrapper[7728]: I0223 14:18:27.153208 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153220 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153234 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153256 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153268 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153280 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153291 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 23 
14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153306 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153318 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 23 14:18:27.153330 master-0 kubenswrapper[7728]: I0223 14:18:27.153338 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 23 14:18:27.153846 master-0 kubenswrapper[7728]: I0223 14:18:27.153362 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 23 14:18:27.153846 master-0 kubenswrapper[7728]: I0223 14:18:27.153407 7728 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 23 14:18:27.155363 master-0 kubenswrapper[7728]: I0223 14:18:27.155304 7728 server.go:1280] "Started kubelet" Feb 23 14:18:27.155437 master-0 kubenswrapper[7728]: I0223 14:18:27.155325 7728 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 23 14:18:27.155983 master-0 kubenswrapper[7728]: I0223 14:18:27.155910 7728 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 23 14:18:27.156051 master-0 kubenswrapper[7728]: I0223 14:18:27.155998 7728 server_v1.go:47] "podresources" method="list" useActivePods=true Feb 23 14:18:27.156795 master-0 systemd[1]: Started Kubernetes Kubelet. 
Feb 23 14:18:27.160432 master-0 kubenswrapper[7728]: I0223 14:18:27.160244 7728 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 23 14:18:27.162128 master-0 kubenswrapper[7728]: I0223 14:18:27.161816 7728 server.go:449] "Adding debug handlers to kubelet server" Feb 23 14:18:27.166523 master-0 kubenswrapper[7728]: I0223 14:18:27.166143 7728 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 14:18:27.166776 master-0 kubenswrapper[7728]: I0223 14:18:27.166636 7728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 23 14:18:27.166776 master-0 kubenswrapper[7728]: I0223 14:18:27.166667 7728 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 23 14:18:27.167413 master-0 kubenswrapper[7728]: I0223 14:18:27.167084 7728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 14:08:38 +0000 UTC, rotation deadline is 2026-02-24 08:10:09.466775824 +0000 UTC Feb 23 14:18:27.167413 master-0 kubenswrapper[7728]: I0223 14:18:27.167119 7728 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h51m42.299658782s for next certificate rotation Feb 23 14:18:27.167413 master-0 kubenswrapper[7728]: I0223 14:18:27.167198 7728 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 23 14:18:27.167413 master-0 kubenswrapper[7728]: I0223 14:18:27.167215 7728 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 23 14:18:27.167663 master-0 kubenswrapper[7728]: I0223 14:18:27.167416 7728 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Feb 23 14:18:27.167751 master-0 kubenswrapper[7728]: I0223 14:18:27.167683 7728 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 14:18:27.175931 master-0 kubenswrapper[7728]: I0223 14:18:27.175857 7728 factory.go:55] 
Registering systemd factory Feb 23 14:18:27.175995 master-0 kubenswrapper[7728]: I0223 14:18:27.175942 7728 factory.go:221] Registration of the systemd container factory successfully Feb 23 14:18:27.176781 master-0 kubenswrapper[7728]: I0223 14:18:27.176755 7728 factory.go:153] Registering CRI-O factory Feb 23 14:18:27.176781 master-0 kubenswrapper[7728]: I0223 14:18:27.176780 7728 factory.go:221] Registration of the crio container factory successfully Feb 23 14:18:27.176950 master-0 kubenswrapper[7728]: I0223 14:18:27.176894 7728 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 23 14:18:27.176950 master-0 kubenswrapper[7728]: I0223 14:18:27.176915 7728 factory.go:103] Registering Raw factory Feb 23 14:18:27.176950 master-0 kubenswrapper[7728]: I0223 14:18:27.176930 7728 manager.go:1196] Started watching for new ooms in manager Feb 23 14:18:27.177134 master-0 kubenswrapper[7728]: I0223 14:18:27.177074 7728 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 14:18:27.179367 master-0 kubenswrapper[7728]: I0223 14:18:27.179315 7728 manager.go:319] Starting recovery of all containers Feb 23 14:18:27.181537 master-0 kubenswrapper[7728]: I0223 14:18:27.181449 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ace75aae-6f4f-4299-90e2-d5292271b136" volumeName="kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs" seLinuxMountContext="" Feb 23 14:18:27.181537 master-0 kubenswrapper[7728]: I0223 14:18:27.181499 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert" 
seLinuxMountContext="" Feb 23 14:18:27.181537 master-0 kubenswrapper[7728]: I0223 14:18:27.181514 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cea0ab8-258b-486c-bb7f-8c93930b296d" volumeName="kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access" seLinuxMountContext="" Feb 23 14:18:27.181537 master-0 kubenswrapper[7728]: I0223 14:18:27.181523 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap" seLinuxMountContext="" Feb 23 14:18:27.181537 master-0 kubenswrapper[7728]: I0223 14:18:27.181532 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57b57915-64dd-42f5-b06f-bc4bcc06b667" volumeName="kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca" seLinuxMountContext="" Feb 23 14:18:27.181537 master-0 kubenswrapper[7728]: I0223 14:18:27.181541 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181553 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9cf1c39-24f0-420b-8020-089616d1cdf0" volumeName="kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181563 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" volumeName="kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides" seLinuxMountContext="" 
Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181573 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" volumeName="kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181588 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24829faf-50e8-45bb-abb0-7cc5ccf81080" volumeName="kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181599 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3488a7eb-5170-478c-9af7-490dbe0f514e" volumeName="kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181608 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3488a7eb-5170-478c-9af7-490dbe0f514e" volumeName="kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181617 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181629 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="865ceedb-b19a-4f2f-b295-311e1b7a645e" volumeName="kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config" seLinuxMountContext="" Feb 
23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181638 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cb6e88cd-98de-446a-92e8-f56a2f133703" volumeName="kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181647 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181681 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09d80e28-0b64-4c5d-a9bc-99d843d40165" volumeName="kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181689 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cea0ab8-258b-486c-bb7f-8c93930b296d" volumeName="kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181698 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9cf1c39-24f0-420b-8020-089616d1cdf0" volumeName="kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181708 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" volumeName="kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm" seLinuxMountContext="" Feb 23 
14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181719 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181728 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" volumeName="kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181738 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b54fc16-d2f7-4b10-a611-5b411b389c5a" volumeName="kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181785 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181797 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides" seLinuxMountContext="" Feb 23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181807 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b714a9df-026e-423d-a980-2569f0d92e47" volumeName="kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868" seLinuxMountContext="" Feb 
23 14:18:27.181833 master-0 kubenswrapper[7728]: I0223 14:18:27.181820 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="865ceedb-b19a-4f2f-b295-311e1b7a645e" volumeName="kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.181831 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="961e4ecd-545b-4270-ae34-e733dec793b6" volumeName="kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.181952 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.181962 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.181972 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08c561b3-613b-425f-9de4-d5fc8762ea51" volumeName="kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.181983 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh" seLinuxMountContext="" Feb 23 
14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.181995 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57b57915-64dd-42f5-b06f-bc4bcc06b667" volumeName="kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182004 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="585f74db-4593-426b-b0c7-ec8f64810549" volumeName="kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182015 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="585f74db-4593-426b-b0c7-ec8f64810549" volumeName="kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182025 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="865ceedb-b19a-4f2f-b295-311e1b7a645e" volumeName="kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182035 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cb6e88cd-98de-446a-92e8-f56a2f133703" volumeName="kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182044 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" volumeName="kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj" 
seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182070 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf04aca0-8174-4134-835d-37adf6a3b5ca" volumeName="kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182080 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="646fece3-4a42-4e0c-bcc7-5f705f948d63" volumeName="kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182089 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182099 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182186 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="961e4ecd-545b-4270-ae34-e733dec793b6" volumeName="kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182199 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a4ae9292-71dc-4484-b277-43cb26c1e04d" volumeName="kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8" seLinuxMountContext="" Feb 23 
14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182210 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24829faf-50e8-45bb-abb0-7cc5ccf81080" volumeName="kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182220 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3488a7eb-5170-478c-9af7-490dbe0f514e" volumeName="kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182231 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="607c1101-3533-43e3-9eda-13cea2b9dbb6" volumeName="kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182241 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="961e4ecd-545b-4270-ae34-e733dec793b6" volumeName="kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182250 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182265 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85" seLinuxMountContext="" Feb 23 14:18:27.182510 
master-0 kubenswrapper[7728]: I0223 14:18:27.182275 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" volumeName="kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182303 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182318 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182331 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="674041a2-e2b0-4286-88cc-f1b00571e3f3" volumeName="kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182342 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="842d45c5-3452-4e97-b5f5-540395330a65" volumeName="kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182352 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b714a9df-026e-423d-a980-2569f0d92e47" volumeName="kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config" seLinuxMountContext="" Feb 23 14:18:27.182510 
master-0 kubenswrapper[7728]: I0223 14:18:27.182362 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9cf1c39-24f0-420b-8020-089616d1cdf0" volumeName="kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182394 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182424 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182435 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf04aca0-8174-4134-835d-37adf6a3b5ca" volumeName="kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182491 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf04aca0-8174-4134-835d-37adf6a3b5ca" volumeName="kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182501 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 
kubenswrapper[7728]: I0223 14:18:27.182511 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182521 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="646fece3-4a42-4e0c-bcc7-5f705f948d63" volumeName="kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182531 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="674041a2-e2b0-4286-88cc-f1b00571e3f3" volumeName="kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j" seLinuxMountContext="" Feb 23 14:18:27.182510 master-0 kubenswrapper[7728]: I0223 14:18:27.182541 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b714a9df-026e-423d-a980-2569f0d92e47" volumeName="kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182551 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182561 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f" seLinuxMountContext="" Feb 23 14:18:27.184246 
master-0 kubenswrapper[7728]: I0223 14:18:27.182572 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" volumeName="kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182582 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182592 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08c561b3-613b-425f-9de4-d5fc8762ea51" volumeName="kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182602 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09d80e28-0b64-4c5d-a9bc-99d843d40165" volumeName="kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182611 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09d80e28-0b64-4c5d-a9bc-99d843d40165" volumeName="kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182620 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24829faf-50e8-45bb-abb0-7cc5ccf81080" volumeName="kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 
kubenswrapper[7728]: I0223 14:18:27.182629 7728 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cb6e88cd-98de-446a-92e8-f56a2f133703" volumeName="kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert" seLinuxMountContext="" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182638 7728 reconstruct.go:97] "Volume reconstruction finished" Feb 23 14:18:27.184246 master-0 kubenswrapper[7728]: I0223 14:18:27.182644 7728 reconciler.go:26] "Reconciler: start to sync state" Feb 23 14:18:27.185073 master-0 kubenswrapper[7728]: I0223 14:18:27.185053 7728 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 23 14:18:27.218141 master-0 kubenswrapper[7728]: I0223 14:18:27.217992 7728 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 23 14:18:27.219423 master-0 kubenswrapper[7728]: I0223 14:18:27.219391 7728 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 23 14:18:27.219516 master-0 kubenswrapper[7728]: I0223 14:18:27.219453 7728 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 23 14:18:27.219516 master-0 kubenswrapper[7728]: I0223 14:18:27.219506 7728 kubelet.go:2335] "Starting kubelet main sync loop" Feb 23 14:18:27.219685 master-0 kubenswrapper[7728]: E0223 14:18:27.219574 7728 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 23 14:18:27.221919 master-0 kubenswrapper[7728]: I0223 14:18:27.221864 7728 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 14:18:27.232432 master-0 kubenswrapper[7728]: I0223 14:18:27.232359 7728 generic.go:334] "Generic (PLEG): container finished" podID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerID="6d4e5cbd51d6e2350099300783b6b53e026119467c4ee08ce357bbba7d0f9eaa" exitCode=0 Feb 23 
14:18:27.236112 master-0 kubenswrapper[7728]: I0223 14:18:27.235448 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/2.log" Feb 23 14:18:27.236112 master-0 kubenswrapper[7728]: I0223 14:18:27.236053 7728 generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19" exitCode=1 Feb 23 14:18:27.236112 master-0 kubenswrapper[7728]: I0223 14:18:27.236079 7728 generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="02553ea2f34fd5b9d9104437dd7120800883c473073a6a74895604093906e009" exitCode=0 Feb 23 14:18:27.238539 master-0 kubenswrapper[7728]: I0223 14:18:27.238421 7728 generic.go:334] "Generic (PLEG): container finished" podID="0514f486-2562-473d-8b01-b69441b82367" containerID="ac1b1e24015c720352cbb49d46332282e9687278977a0db4df21fe4d03fe58bd" exitCode=0 Feb 23 14:18:27.243696 master-0 kubenswrapper[7728]: I0223 14:18:27.243423 7728 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="6c7ee6bebf88d829805371dc4fd4b58845a3f175897eb6486d1688a8a41b95ec" exitCode=0 Feb 23 14:18:27.243696 master-0 kubenswrapper[7728]: I0223 14:18:27.243456 7728 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="a6fb92a24f40b4f0a4db9442684eefd34b35d2511917f6d03fe2ac8345b66ead" exitCode=0 Feb 23 14:18:27.243696 master-0 kubenswrapper[7728]: I0223 14:18:27.243465 7728 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="e5ef5b210d67b35d196c3c58900eaedb9852f06e215b468c9e1c1dc53fce376f" exitCode=0 Feb 23 14:18:27.243696 master-0 kubenswrapper[7728]: I0223 14:18:27.243489 7728 generic.go:334] "Generic (PLEG): container finished" 
podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="29b61cbeccf4eaed8df82b56cbe6a444cd43fd7fd1043bff465ae48185e7e6a0" exitCode=0 Feb 23 14:18:27.243696 master-0 kubenswrapper[7728]: I0223 14:18:27.243693 7728 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="5fd96309ade76aec20ed37e459e178ae08d952af2aa513f3703806ca12a7c927" exitCode=0 Feb 23 14:18:27.243696 master-0 kubenswrapper[7728]: I0223 14:18:27.243701 7728 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="7e56c504fefbada4ed2745ea4973c98d064a08b56a86637d3809d7946280cc20" exitCode=0 Feb 23 14:18:27.251155 master-0 kubenswrapper[7728]: I0223 14:18:27.251057 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="ad9ef13f95d7901e7f24b0914da444cc2df5f3bc77853f6da272e6cf3ddf8974" exitCode=1 Feb 23 14:18:27.256564 master-0 kubenswrapper[7728]: I0223 14:18:27.256516 7728 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="881e3b61730f49e9657641d193738c054ca1938ca39d0f830ceee7b02b6b1f78" exitCode=0 Feb 23 14:18:27.262205 master-0 kubenswrapper[7728]: I0223 14:18:27.262157 7728 generic.go:334] "Generic (PLEG): container finished" podID="687e92a6cecf1e2beeef16a0b322ad08" containerID="40ca3552a0c110bf631be979ddbff1eb4abba63ee7c1c34c419314566066d566" exitCode=0 Feb 23 14:18:27.269197 master-0 kubenswrapper[7728]: I0223 14:18:27.269168 7728 generic.go:334] "Generic (PLEG): container finished" podID="f10f592e-5738-4879-b776-246b357d4621" containerID="d631da69f8bc3fb53c35b8ef8cedda80eee352d8a4bf7c9c1590bb5315fa046f" exitCode=0 Feb 23 14:18:27.313185 master-0 kubenswrapper[7728]: I0223 14:18:27.313144 7728 manager.go:324] Recovery completed Feb 23 14:18:27.319760 master-0 kubenswrapper[7728]: E0223 14:18:27.319684 7728 kubelet.go:2359] "Skipping pod synchronization" err="container 
runtime status check may not have completed yet" Feb 23 14:18:27.353361 master-0 kubenswrapper[7728]: I0223 14:18:27.353312 7728 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 23 14:18:27.353361 master-0 kubenswrapper[7728]: I0223 14:18:27.353331 7728 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 23 14:18:27.353361 master-0 kubenswrapper[7728]: I0223 14:18:27.353351 7728 state_mem.go:36] "Initialized new in-memory state store" Feb 23 14:18:27.353635 master-0 kubenswrapper[7728]: I0223 14:18:27.353524 7728 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 23 14:18:27.353635 master-0 kubenswrapper[7728]: I0223 14:18:27.353535 7728 state_mem.go:96] "Updated CPUSet assignments" assignments={} Feb 23 14:18:27.353635 master-0 kubenswrapper[7728]: I0223 14:18:27.353557 7728 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Feb 23 14:18:27.353635 master-0 kubenswrapper[7728]: I0223 14:18:27.353564 7728 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Feb 23 14:18:27.353635 master-0 kubenswrapper[7728]: I0223 14:18:27.353571 7728 policy_none.go:49] "None policy: Start" Feb 23 14:18:27.355801 master-0 kubenswrapper[7728]: I0223 14:18:27.355767 7728 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 23 14:18:27.355883 master-0 kubenswrapper[7728]: I0223 14:18:27.355830 7728 state_mem.go:35] "Initializing new in-memory state store" Feb 23 14:18:27.356223 master-0 kubenswrapper[7728]: I0223 14:18:27.356195 7728 state_mem.go:75] "Updated machine memory state" Feb 23 14:18:27.356223 master-0 kubenswrapper[7728]: I0223 14:18:27.356221 7728 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Feb 23 14:18:27.369558 master-0 kubenswrapper[7728]: I0223 14:18:27.369499 7728 manager.go:334] "Starting Device Plugin manager" Feb 23 14:18:27.369757 master-0 kubenswrapper[7728]: I0223 14:18:27.369578 7728 manager.go:513] "Failed to read data from checkpoint" 
checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 23 14:18:27.369757 master-0 kubenswrapper[7728]: I0223 14:18:27.369598 7728 server.go:79] "Starting device plugin registration server" Feb 23 14:18:27.370283 master-0 kubenswrapper[7728]: I0223 14:18:27.370252 7728 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 23 14:18:27.370341 master-0 kubenswrapper[7728]: I0223 14:18:27.370274 7728 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 23 14:18:27.370529 master-0 kubenswrapper[7728]: I0223 14:18:27.370496 7728 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 23 14:18:27.370659 master-0 kubenswrapper[7728]: I0223 14:18:27.370604 7728 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 23 14:18:27.370659 master-0 kubenswrapper[7728]: I0223 14:18:27.370631 7728 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 23 14:18:27.471148 master-0 kubenswrapper[7728]: I0223 14:18:27.470810 7728 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:18:27.472836 master-0 kubenswrapper[7728]: I0223 14:18:27.472787 7728 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:18:27.472836 master-0 kubenswrapper[7728]: I0223 14:18:27.472837 7728 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:18:27.472972 master-0 kubenswrapper[7728]: I0223 14:18:27.472856 7728 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:18:27.472972 master-0 kubenswrapper[7728]: I0223 14:18:27.472919 7728 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:18:27.482915 master-0 kubenswrapper[7728]: I0223 14:18:27.482853 7728 
kubelet_node_status.go:115] "Node was previously registered" node="master-0" Feb 23 14:18:27.483118 master-0 kubenswrapper[7728]: I0223 14:18:27.482987 7728 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Feb 23 14:18:27.520317 master-0 kubenswrapper[7728]: I0223 14:18:27.520201 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0"] Feb 23 14:18:27.520882 master-0 kubenswrapper[7728]: I0223 14:18:27.520784 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"12dab5d350ebc129b0bfa4714d330b15","Type":"ContainerStarted","Data":"86dd361ededa7f9d61d9c2bea900261b661a76c0468603804e9af20765f8d8cd"} Feb 23 14:18:27.520932 master-0 kubenswrapper[7728]: I0223 14:18:27.520896 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"12dab5d350ebc129b0bfa4714d330b15","Type":"ContainerStarted","Data":"5809ecf60a8e4db68dfab073298af03c567dcc4e91a5b6d7f6d78ca758010d15"} Feb 23 14:18:27.520932 master-0 kubenswrapper[7728]: I0223 14:18:27.520924 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"12dab5d350ebc129b0bfa4714d330b15","Type":"ContainerStarted","Data":"dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0"} Feb 23 14:18:27.521087 master-0 kubenswrapper[7728]: I0223 14:18:27.520950 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="477770141bf255a57c3800674a1973731f4824ba7eeefbf57c50b1692c3eba69" Feb 23 14:18:27.521087 master-0 kubenswrapper[7728]: I0223 14:18:27.520977 7728 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="703008ca22883d01a165a1fd5907792c8c2d30c013e9c784479d77eccc42b82c" Feb 23 14:18:27.521087 master-0 kubenswrapper[7728]: I0223 14:18:27.520993 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerStarted","Data":"404214ba54b8b128195146c77065f2702359c7ee02579e2cb25064ddce3c7dcc"} Feb 23 14:18:27.521087 master-0 kubenswrapper[7728]: I0223 14:18:27.521029 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19"} Feb 23 14:18:27.521087 master-0 kubenswrapper[7728]: I0223 14:18:27.521048 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"02553ea2f34fd5b9d9104437dd7120800883c473073a6a74895604093906e009"} Feb 23 14:18:27.521087 master-0 kubenswrapper[7728]: I0223 14:18:27.521065 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerStarted","Data":"57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117"} Feb 23 14:18:27.521087 master-0 kubenswrapper[7728]: I0223 14:18:27.521086 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518" Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521130 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521147 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"f38113657e6647d113d4b8b771a4b871cb4df714ffeae8172aebba272b7e4da9"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521168 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"ad9ef13f95d7901e7f24b0914da444cc2df5f3bc77853f6da272e6cf3ddf8974"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521190 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"64bb509af2e5ff8862c488db172772b3eb4a331f81000bc6d7b1d4be31a7f27d"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521225 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerStarted","Data":"e3d987d25306f70a7327b5bce6ea549b476972db2d3366cf37d35b30c1531578"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521243 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerStarted","Data":"bdd3290dcf6f732f006b381bec2edfc3a7a58623787040a36811efd529225351"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521261 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerDied","Data":"40ca3552a0c110bf631be979ddbff1eb4abba63ee7c1c34c419314566066d566"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521281 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"687e92a6cecf1e2beeef16a0b322ad08","Type":"ContainerStarted","Data":"0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521336 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"56c3cb71c9851003c8de7e7c5db4b87e","Type":"ContainerStarted","Data":"1161af5c0919fc04c557fffb0fa1799b448226d91a3bed741eb027099a2bf8f9"} Feb 23 14:18:27.521922 master-0 kubenswrapper[7728]: I0223 14:18:27.521352 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"56c3cb71c9851003c8de7e7c5db4b87e","Type":"ContainerStarted","Data":"6b4c5917f42a018e656736ffe0ec509b45d342d70ccb1039a1f41866022cf32e"} Feb 23 14:18:27.535985 master-0 kubenswrapper[7728]: E0223 14:18:27.535916 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:27.535985 master-0 kubenswrapper[7728]: E0223 14:18:27.535950 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:18:27.536264 master-0 kubenswrapper[7728]: E0223 14:18:27.535916 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Feb 23 
14:18:27.539056 master-0 kubenswrapper[7728]: W0223 14:18:27.539025 7728 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Feb 23 14:18:27.539299 master-0 kubenswrapper[7728]: E0223 14:18:27.539075 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:18:27.540099 master-0 kubenswrapper[7728]: E0223 14:18:27.540061 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Feb 23 14:18:27.587598 master-0 kubenswrapper[7728]: I0223 14:18:27.587536 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Feb 23 14:18:27.587598 master-0 kubenswrapper[7728]: I0223 14:18:27.587575 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:18:27.587598 master-0 kubenswrapper[7728]: I0223 14:18:27.587601 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587647 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587698 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587729 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587760 7728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587795 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587825 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587851 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:18:27.587927 master-0 kubenswrapper[7728]: I0223 14:18:27.587878 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:18:27.588681 master-0 kubenswrapper[7728]: I0223 14:18:27.587929 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:18:27.588681 master-0 kubenswrapper[7728]: I0223 14:18:27.587968 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.588681 master-0 kubenswrapper[7728]: I0223 14:18:27.587999 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.588681 master-0 kubenswrapper[7728]: I0223 14:18:27.588029 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.588681 master-0 kubenswrapper[7728]: I0223 14:18:27.588065 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.588681 master-0 kubenswrapper[7728]: I0223 14:18:27.588093 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.688773 master-0 kubenswrapper[7728]: I0223 14:18:27.688745 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.688902 master-0 kubenswrapper[7728]: I0223 14:18:27.688584 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.688902 master-0 kubenswrapper[7728]: I0223 14:18:27.688818 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.688902 master-0 kubenswrapper[7728]: I0223 14:18:27.688846 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.688902 master-0 kubenswrapper[7728]: I0223 14:18:27.688867 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:18:27.688902 master-0 kubenswrapper[7728]: I0223 14:18:27.688891 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:18:27.689073 master-0 kubenswrapper[7728]: I0223 14:18:27.688911 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.689073 master-0 kubenswrapper[7728]: I0223 14:18:27.688913 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:18:27.689073 master-0 kubenswrapper[7728]: I0223 14:18:27.688946 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:18:27.689166 master-0 kubenswrapper[7728]: I0223 14:18:27.689104 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.689214 master-0 kubenswrapper[7728]: I0223 14:18:27.689170 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:18:27.689214 master-0 kubenswrapper[7728]: I0223 14:18:27.689203 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:18:27.689282 master-0 kubenswrapper[7728]: I0223 14:18:27.689145 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:18:27.689282 master-0 kubenswrapper[7728]: I0223 14:18:27.689237 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:18:27.689282 master-0 kubenswrapper[7728]: I0223 14:18:27.689276 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689380 master-0 kubenswrapper[7728]: I0223 14:18:27.689321 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689380 master-0 kubenswrapper[7728]: I0223 14:18:27.689331 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689380 master-0 kubenswrapper[7728]: I0223 14:18:27.689363 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689504 master-0 kubenswrapper[7728]: I0223 14:18:27.689366 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689504 master-0 kubenswrapper[7728]: I0223 14:18:27.689392 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689504 master-0 kubenswrapper[7728]: I0223 14:18:27.689408 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689504 master-0 kubenswrapper[7728]: I0223 14:18:27.689431 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.689504 master-0 kubenswrapper[7728]: I0223 14:18:27.689455 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:18:27.689504 master-0 kubenswrapper[7728]: I0223 14:18:27.689459 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689504 master-0 kubenswrapper[7728]: I0223 14:18:27.689496 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689513 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689542 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689528 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689591 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689617 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689640 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689645 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689692 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.689794 master-0 kubenswrapper[7728]: I0223 14:18:27.689723 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:27.856371 master-0 kubenswrapper[7728]: I0223 14:18:27.856166 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:27.860899 master-0 kubenswrapper[7728]: I0223 14:18:27.860819 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:28.149077 master-0 kubenswrapper[7728]: I0223 14:18:28.148914 7728 apiserver.go:52] "Watching apiserver"
Feb 23 14:18:28.159094 master-0 kubenswrapper[7728]: I0223 14:18:28.159042 7728 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 23 14:18:28.160124 master-0 kubenswrapper[7728]: I0223 14:18:28.160062 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88","openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x","openshift-network-operator/iptables-alerter-t5h8h","openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b","openshift-marketplace/marketplace-operator-6f5488b997-7b5sp","assisted-installer/assisted-installer-controller-r6z45","openshift-etcd/etcd-master-0-master-0","openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6","kube-system/bootstrap-kube-scheduler-master-0","openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2","openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v","openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v","openshift-multus/network-metrics-daemon-9dnsv","openshift-network-diagnostics/network-check-target-x9gxm","openshift-network-node-identity/network-node-identity-td489","openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7","openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm","openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp","kube-system/bootstrap-kube-controller-manager-master-0","openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz","openshift-ingress-operator/ingress-operator-6569778c84-hsl6c","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-multus/multus-additional-cni-plugins-jdsv6","openshift-multus/multus-vdzqk","openshift-network-operator/network-operator-7d7db75979-x4qnw","openshift-ovn-kubernetes/ovnkube-node-ftngv","openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg","openshift-dns-operator/dns-operator-8c7d49845-5rk2g","openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"]
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.160335 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6z45"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.160897 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.162628 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.162858 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.162957 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.162968 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.162999 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.163696 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.163845 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.164063 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.164461 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.164533 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.164782 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.166528 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.167059 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:18:28.167541 master-0 kubenswrapper[7728]: I0223 14:18:28.167245 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:28.168166 master-0 kubenswrapper[7728]: I0223 14:18:28.168139 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:18:28.175111 master-0 kubenswrapper[7728]: I0223 14:18:28.175088 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.176317 master-0 kubenswrapper[7728]: I0223 14:18:28.175315 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.176439 master-0 kubenswrapper[7728]: I0223 14:18:28.176215 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.176557 master-0 kubenswrapper[7728]: I0223 14:18:28.176539 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Feb 23 14:18:28.176679 master-0 kubenswrapper[7728]: I0223 14:18:28.176658 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.176789 master-0 kubenswrapper[7728]: I0223 14:18:28.176766 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Feb 23 14:18:28.176890 master-0 kubenswrapper[7728]: I0223 14:18:28.176866 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.177186 master-0 kubenswrapper[7728]: I0223 14:18:28.177146 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 23 14:18:28.177257 master-0 kubenswrapper[7728]: I0223 14:18:28.177240 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 23 14:18:28.177405 master-0 kubenswrapper[7728]: I0223 14:18:28.177389 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 23 14:18:28.177566 master-0 kubenswrapper[7728]: I0223 14:18:28.177550 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.177756 master-0 kubenswrapper[7728]: I0223 14:18:28.177698 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 23 14:18:28.177847 master-0 kubenswrapper[7728]: I0223 14:18:28.177833 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.178042 master-0 kubenswrapper[7728]: I0223 14:18:28.177987 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Feb 23 14:18:28.178092 master-0 kubenswrapper[7728]: I0223 14:18:28.178074 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 23 14:18:28.178204 master-0 kubenswrapper[7728]: I0223 14:18:28.178182 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.178236 master-0 kubenswrapper[7728]: I0223 14:18:28.178212 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 23 14:18:28.178321 master-0 kubenswrapper[7728]: I0223 14:18:28.178305 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 23 14:18:28.178394 master-0 kubenswrapper[7728]: I0223 14:18:28.178370 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.178428 master-0 kubenswrapper[7728]: I0223 14:18:28.178410 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.178428 master-0 kubenswrapper[7728]: I0223 14:18:28.178428 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 23 14:18:28.178578 master-0 kubenswrapper[7728]: I0223 14:18:28.178534 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.178617 master-0 kubenswrapper[7728]: I0223 14:18:28.178566 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 23 14:18:28.178646 master-0 kubenswrapper[7728]: I0223 14:18:28.178627 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Feb 23 14:18:28.178646 master-0 kubenswrapper[7728]: I0223 14:18:28.178633 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 23 14:18:28.178712 master-0 kubenswrapper[7728]: I0223 14:18:28.178656 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.178741 master-0 kubenswrapper[7728]: I0223 14:18:28.178705 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 23 14:18:28.178741 master-0 kubenswrapper[7728]: I0223 14:18:28.178729 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 23 14:18:28.178798 master-0 kubenswrapper[7728]: I0223 14:18:28.178747 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Feb 23 14:18:28.178798 master-0 kubenswrapper[7728]: I0223 14:18:28.178763 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 23 14:18:28.178850 master-0 kubenswrapper[7728]: I0223 14:18:28.178547 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 23 14:18:28.178850 master-0 kubenswrapper[7728]: I0223 14:18:28.178836 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 23 14:18:28.178902 master-0 kubenswrapper[7728]: I0223 14:18:28.178839 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 23 14:18:28.178902 master-0 kubenswrapper[7728]: I0223 14:18:28.178861 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 23 14:18:28.178902 master-0 kubenswrapper[7728]: I0223 14:18:28.178888 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 23 14:18:28.178980 master-0 kubenswrapper[7728]: I0223 14:18:28.177719 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 23 14:18:28.178980 master-0 kubenswrapper[7728]: I0223 14:18:28.178027 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Feb 23 14:18:28.178980 master-0 kubenswrapper[7728]: I0223 14:18:28.178934 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.178980 master-0 kubenswrapper[7728]: I0223 14:18:28.178960 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.178980 master-0 kubenswrapper[7728]: I0223 14:18:28.178565 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.179104 master-0 kubenswrapper[7728]: I0223 14:18:28.178979 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 23 14:18:28.179104 master-0 kubenswrapper[7728]: I0223 14:18:28.179043 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.179104 master-0 kubenswrapper[7728]: I0223 14:18:28.179055 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 23 14:18:28.179104 master-0 kubenswrapper[7728]: I0223 14:18:28.179064 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 23 14:18:28.179104 master-0 kubenswrapper[7728]: I0223 14:18:28.179093 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 23 14:18:28.179224 master-0 kubenswrapper[7728]: I0223 14:18:28.179125 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.179467 master-0 kubenswrapper[7728]: I0223 14:18:28.179442 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.180288 master-0 kubenswrapper[7728]: I0223 14:18:28.180269 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 23 14:18:28.180346 master-0 kubenswrapper[7728]: I0223 14:18:28.180283 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.187421 master-0 kubenswrapper[7728]: I0223 14:18:28.187391 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.190025 master-0 kubenswrapper[7728]: I0223 14:18:28.189983 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 23 14:18:28.190405 master-0 kubenswrapper[7728]: I0223 14:18:28.190115 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.190534 master-0 kubenswrapper[7728]: I0223 14:18:28.190507 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 23 14:18:28.190927 master-0 kubenswrapper[7728]: I0223 14:18:28.190898 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 23 14:18:28.191523 master-0 kubenswrapper[7728]: I0223 14:18:28.191502 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 23 14:18:28.191639 master-0 kubenswrapper[7728]: I0223 14:18:28.191617 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 23 14:18:28.191686 master-0 kubenswrapper[7728]: I0223 14:18:28.191625 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 23 14:18:28.191786 master-0 kubenswrapper[7728]: I0223 14:18:28.191743 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 23 14:18:28.191829 master-0 kubenswrapper[7728]: I0223 14:18:28.191809 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 23 14:18:28.191829 master-0 kubenswrapper[7728]: I0223 14:18:28.191822 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 23 14:18:28.191878 master-0 kubenswrapper[7728]: I0223 14:18:28.191767 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 23 14:18:28.192108 master-0 kubenswrapper[7728]: I0223 14:18:28.192066 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:18:28.192162 master-0 kubenswrapper[7728]: I0223 14:18:28.192112 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j54j5\" (UniqueName: \"kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:28.192162 master-0 kubenswrapper[7728]: I0223 14:18:28.192145 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.192221 master-0 kubenswrapper[7728]: I0223 14:18:28.192170 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.192274 master-0 kubenswrapper[7728]: I0223 14:18:28.192247 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.192316 master-0 kubenswrapper[7728]: I0223 14:18:28.192292 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.192347 master-0 kubenswrapper[7728]: I0223 14:18:28.192322 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:28.192379 master-0 kubenswrapper[7728]: I0223 14:18:28.192349 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert\") pod
\"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:18:28.192507 master-0 kubenswrapper[7728]: I0223 14:18:28.192380 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:28.192507 master-0 kubenswrapper[7728]: I0223 14:18:28.192406 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:28.192507 master-0 kubenswrapper[7728]: I0223 14:18:28.192430 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:28.192507 master-0 kubenswrapper[7728]: I0223 14:18:28.192461 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:18:28.192636 master-0 kubenswrapper[7728]: I0223 
14:18:28.192507 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:28.192771 master-0 kubenswrapper[7728]: I0223 14:18:28.192210 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 14:18:28.192987 master-0 kubenswrapper[7728]: I0223 14:18:28.192948 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:28.193046 master-0 kubenswrapper[7728]: I0223 14:18:28.193022 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.193079 master-0 kubenswrapper[7728]: I0223 14:18:28.193040 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Feb 23 14:18:28.193194 master-0 kubenswrapper[7728]: I0223 14:18:28.192402 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 14:18:28.193242 master-0 kubenswrapper[7728]: I0223 14:18:28.193224 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:28.193459 master-0 kubenswrapper[7728]: I0223 14:18:28.193432 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:28.193561 master-0 kubenswrapper[7728]: I0223 14:18:28.193540 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:18:28.193751 master-0 kubenswrapper[7728]: I0223 14:18:28.193737 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:28.193818 master-0 kubenswrapper[7728]: I0223 14:18:28.193754 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 14:18:28.193940 master-0 kubenswrapper[7728]: I0223 14:18:28.193055 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.194038 master-0 kubenswrapper[7728]: I0223 14:18:28.194000 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6tj\" (UniqueName: \"kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:28.194085 master-0 kubenswrapper[7728]: I0223 14:18:28.193166 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:28.194085 master-0 kubenswrapper[7728]: I0223 14:18:28.194060 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:28.194150 master-0 kubenswrapper[7728]: I0223 14:18:28.194103 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:28.194150 
master-0 kubenswrapper[7728]: I0223 14:18:28.193785 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 14:18:28.194207 master-0 kubenswrapper[7728]: I0223 14:18:28.192674 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 14:18:28.194207 master-0 kubenswrapper[7728]: I0223 14:18:28.194141 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:28.194260 master-0 kubenswrapper[7728]: I0223 14:18:28.194224 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:28.194290 master-0 kubenswrapper[7728]: I0223 14:18:28.194264 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.194451 master-0 kubenswrapper[7728]: I0223 14:18:28.194428 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: 
\"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.194507 master-0 kubenswrapper[7728]: I0223 14:18:28.194462 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:28.194539 master-0 kubenswrapper[7728]: I0223 14:18:28.194506 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:28.194658 master-0 kubenswrapper[7728]: I0223 14:18:28.194627 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:28.194703 master-0 kubenswrapper[7728]: I0223 14:18:28.194676 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:28.194874 master-0 kubenswrapper[7728]: I0223 
14:18:28.194851 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 23 14:18:28.194965 master-0 kubenswrapper[7728]: I0223 14:18:28.194953 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 14:18:28.195217 master-0 kubenswrapper[7728]: I0223 14:18:28.195174 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:28.195366 master-0 kubenswrapper[7728]: I0223 14:18:28.195334 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 14:18:28.195462 master-0 kubenswrapper[7728]: I0223 14:18:28.195431 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:18:28.195821 master-0 kubenswrapper[7728]: I0223 14:18:28.195598 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:28.195962 master-0 kubenswrapper[7728]: I0223 14:18:28.195925 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5kb\" (UniqueName: \"kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb\") pod 
\"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:18:28.196045 master-0 kubenswrapper[7728]: I0223 14:18:28.196007 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:28.196080 master-0 kubenswrapper[7728]: I0223 14:18:28.195832 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 14:18:28.196157 master-0 kubenswrapper[7728]: I0223 14:18:28.196119 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:28.196315 master-0 kubenswrapper[7728]: I0223 14:18:28.195977 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 14:18:28.196385 master-0 kubenswrapper[7728]: I0223 14:18:28.196290 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:28.196441 master-0 kubenswrapper[7728]: I0223 14:18:28.196399 7728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:28.196471 master-0 kubenswrapper[7728]: I0223 14:18:28.196429 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:28.196471 master-0 kubenswrapper[7728]: I0223 14:18:28.196419 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.196596 master-0 kubenswrapper[7728]: I0223 14:18:28.196575 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:28.196627 master-0 kubenswrapper[7728]: I0223 14:18:28.196546 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: 
\"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.196654 master-0 kubenswrapper[7728]: I0223 14:18:28.196629 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2p2\" (UniqueName: \"kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:28.196804 master-0 kubenswrapper[7728]: I0223 14:18:28.196772 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Feb 23 14:18:28.196833 master-0 kubenswrapper[7728]: I0223 14:18:28.196775 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.196888 master-0 kubenswrapper[7728]: I0223 14:18:28.196861 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sbp\" (UniqueName: \"kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:28.197050 master-0 kubenswrapper[7728]: I0223 14:18:28.196915 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chznd\" (UniqueName: \"kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd\") pod 
\"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:28.197050 master-0 kubenswrapper[7728]: I0223 14:18:28.196955 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.197050 master-0 kubenswrapper[7728]: I0223 14:18:28.196996 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llc8\" (UniqueName: \"kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8\") pod \"csi-snapshot-controller-operator-6fb4df594f-hkcgz\" (UID: \"a4ae9292-71dc-4484-b277-43cb26c1e04d\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" Feb 23 14:18:28.197050 master-0 kubenswrapper[7728]: I0223 14:18:28.197034 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.197296 master-0 kubenswrapper[7728]: I0223 14:18:28.197274 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 14:18:28.197416 master-0 kubenswrapper[7728]: I0223 14:18:28.197384 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.197443 master-0 kubenswrapper[7728]: I0223 14:18:28.197411 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 14:18:28.197443 master-0 kubenswrapper[7728]: I0223 14:18:28.197435 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.197510 master-0 kubenswrapper[7728]: I0223 14:18:28.197474 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.197636 master-0 kubenswrapper[7728]: I0223 14:18:28.197614 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:28.198072 master-0 kubenswrapper[7728]: I0223 14:18:28.198031 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert\") pod 
\"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:28.198127 master-0 kubenswrapper[7728]: I0223 14:18:28.198102 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:28.198162 master-0 kubenswrapper[7728]: I0223 14:18:28.198146 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.198196 master-0 kubenswrapper[7728]: I0223 14:18:28.198185 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:28.198223 master-0 kubenswrapper[7728]: I0223 14:18:28.198195 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.198223 master-0 kubenswrapper[7728]: I0223 14:18:28.198213 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:28.198406 master-0 kubenswrapper[7728]: I0223 14:18:28.198376 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:28.198446 master-0 kubenswrapper[7728]: I0223 14:18:28.198427 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.198527 master-0 kubenswrapper[7728]: I0223 14:18:28.198473 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.198571 master-0 kubenswrapper[7728]: I0223 14:18:28.198534 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: 
\"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:28.198608 master-0 kubenswrapper[7728]: I0223 14:18:28.198579 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:18:28.198636 master-0 kubenswrapper[7728]: I0223 14:18:28.198623 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.198725 master-0 kubenswrapper[7728]: I0223 14:18:28.198690 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:18:28.198783 master-0 kubenswrapper[7728]: I0223 14:18:28.198584 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.198847 master-0 kubenswrapper[7728]: I0223 14:18:28.198805 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.198883 master-0 kubenswrapper[7728]: I0223 14:18:28.198864 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:18:28.198883 master-0 kubenswrapper[7728]: I0223 14:18:28.198830 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.199437 master-0 kubenswrapper[7728]: I0223 14:18:28.199403 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 14:18:28.200769 master-0 kubenswrapper[7728]: I0223 14:18:28.200735 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:18:28.201082 master-0 kubenswrapper[7728]: I0223 14:18:28.201041 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Feb 23 14:18:28.201359 master-0 kubenswrapper[7728]: 
I0223 14:18:28.201325 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:18:28.201400 master-0 kubenswrapper[7728]: I0223 14:18:28.201366 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:28.201618 master-0 kubenswrapper[7728]: I0223 14:18:28.201580 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:28.201682 master-0 kubenswrapper[7728]: I0223 14:18:28.201664 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.201730 master-0 kubenswrapper[7728]: I0223 14:18:28.201711 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.201801 master-0 kubenswrapper[7728]: I0223 14:18:28.201775 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:28.201848 master-0 kubenswrapper[7728]: I0223 14:18:28.201831 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5f8j\" (UniqueName: \"kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:28.201964 master-0 kubenswrapper[7728]: I0223 14:18:28.201939 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:28.201999 master-0 kubenswrapper[7728]: I0223 14:18:28.201978 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:28.202026 master-0 kubenswrapper[7728]: I0223 14:18:28.202011 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr85\" (UniqueName: \"kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:28.202078 master-0 kubenswrapper[7728]: I0223 14:18:28.202056 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7jvd\" (UniqueName: \"kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:28.202139 master-0 kubenswrapper[7728]: I0223 14:18:28.202115 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:18:28.202175 master-0 kubenswrapper[7728]: I0223 14:18:28.201940 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:18:28.202240 master-0 kubenswrapper[7728]: I0223 14:18:28.202205 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:28.202498 master-0 kubenswrapper[7728]: I0223 14:18:28.202454 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:18:28.202498 master-0 kubenswrapper[7728]: I0223 14:18:28.202457 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qggzs\" (UniqueName: \"kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:28.202760 master-0 kubenswrapper[7728]: I0223 14:18:28.202524 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:28.202822 master-0 kubenswrapper[7728]: I0223 14:18:28.202792 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") 
" pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.202859 master-0 kubenswrapper[7728]: I0223 14:18:28.202825 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.202859 master-0 kubenswrapper[7728]: I0223 14:18:28.202849 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.202911 master-0 kubenswrapper[7728]: I0223 14:18:28.202871 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9z2f\" (UniqueName: \"kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.202945 master-0 kubenswrapper[7728]: I0223 14:18:28.202933 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:28.203069 master-0 kubenswrapper[7728]: I0223 14:18:28.203040 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config\") pod 
\"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:28.203253 master-0 kubenswrapper[7728]: I0223 14:18:28.203208 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:18:28.203299 master-0 kubenswrapper[7728]: I0223 14:18:28.203201 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:18:28.203327 master-0 kubenswrapper[7728]: I0223 14:18:28.203235 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 23 14:18:28.203360 master-0 kubenswrapper[7728]: I0223 14:18:28.203304 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 14:18:28.203505 master-0 kubenswrapper[7728]: I0223 14:18:28.203283 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:28.203563 master-0 kubenswrapper[7728]: I0223 14:18:28.203510 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wzkcs\" (UniqueName: \"kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:28.203563 master-0 kubenswrapper[7728]: I0223 14:18:28.203546 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:28.203631 master-0 kubenswrapper[7728]: I0223 14:18:28.203581 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:18:28.203631 master-0 kubenswrapper[7728]: I0223 14:18:28.203592 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.203720 master-0 kubenswrapper[7728]: I0223 14:18:28.203634 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.203720 master-0 kubenswrapper[7728]: I0223 14:18:28.203655 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.203720 master-0 kubenswrapper[7728]: I0223 14:18:28.203681 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.203720 master-0 kubenswrapper[7728]: I0223 14:18:28.203703 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:28.203874 master-0 kubenswrapper[7728]: I0223 14:18:28.203729 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:18:28.203874 master-0 kubenswrapper[7728]: I0223 14:18:28.203757 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config\") pod 
\"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:28.203874 master-0 kubenswrapper[7728]: I0223 14:18:28.203785 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qszm\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:28.203874 master-0 kubenswrapper[7728]: I0223 14:18:28.203817 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88qnh\" (UniqueName: \"kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.203874 master-0 kubenswrapper[7728]: I0223 14:18:28.203844 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.203874 master-0 kubenswrapper[7728]: I0223 14:18:28.203864 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.204088 master-0 kubenswrapper[7728]: I0223 14:18:28.203884 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:28.204088 master-0 kubenswrapper[7728]: I0223 14:18:28.203907 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:28.204088 master-0 kubenswrapper[7728]: I0223 14:18:28.203932 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:28.204088 master-0 kubenswrapper[7728]: I0223 14:18:28.204009 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brd4j\" (UniqueName: \"kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:18:28.204088 master-0 kubenswrapper[7728]: I0223 14:18:28.204029 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" 
(UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.204088 master-0 kubenswrapper[7728]: I0223 14:18:28.204048 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.204088 master-0 kubenswrapper[7728]: I0223 14:18:28.204047 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204144 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204173 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204185 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204195 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204246 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204271 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204308 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204332 7728 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fjs6f\" (UniqueName: \"kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204335 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:18:28.204336 master-0 kubenswrapper[7728]: I0223 14:18:28.204359 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204388 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-269v7\" (UniqueName: \"kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204414 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 
14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204438 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204456 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204499 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204520 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204564 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: 
\"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204640 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204651 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204733 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:18:28.204759 master-0 kubenswrapper[7728]: I0223 14:18:28.204759 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:28.205053 master-0 kubenswrapper[7728]: I0223 14:18:28.204820 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 23 14:18:28.205053 master-0 kubenswrapper[7728]: I0223 14:18:28.204973 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.205053 master-0 kubenswrapper[7728]: I0223 14:18:28.204987 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:18:28.205053 master-0 kubenswrapper[7728]: I0223 14:18:28.205033 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 23 14:18:28.205157 master-0 kubenswrapper[7728]: I0223 14:18:28.205032 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:28.205157 master-0 kubenswrapper[7728]: I0223 14:18:28.205036 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:18:28.205157 master-0 kubenswrapper[7728]: I0223 14:18:28.205099 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 23 14:18:28.205157 master-0 kubenswrapper[7728]: I0223 14:18:28.205126 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 23 14:18:28.205157 master-0 kubenswrapper[7728]: I0223 14:18:28.205134 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 23 14:18:28.205157 master-0 kubenswrapper[7728]: I0223 14:18:28.205145 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 23 14:18:28.205357 master-0 kubenswrapper[7728]: I0223 14:18:28.205167 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.205357 master-0 kubenswrapper[7728]: I0223 14:18:28.205213 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:18:28.205357 master-0 kubenswrapper[7728]: I0223 14:18:28.205218 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.205357 master-0 kubenswrapper[7728]: I0223 14:18:28.205269 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:28.205357 master-0 kubenswrapper[7728]: I0223 14:18:28.205295 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzsd\" (UniqueName: \"kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:28.205546 master-0 kubenswrapper[7728]: I0223 14:18:28.205379 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.205546 master-0 kubenswrapper[7728]: I0223 14:18:28.205501 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:18:28.205846 master-0 kubenswrapper[7728]: I0223 14:18:28.205785 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.206365 master-0 kubenswrapper[7728]: I0223 14:18:28.206009 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:28.207567 master-0 kubenswrapper[7728]: I0223 14:18:28.207538 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp42\" (UniqueName: \"kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:18:28.207613 master-0 kubenswrapper[7728]: I0223 14:18:28.207580 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:28.207613 master-0 kubenswrapper[7728]: I0223 14:18:28.207604 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:28.207664 master-0 kubenswrapper[7728]: I0223 14:18:28.207627 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr868\" (UniqueName: \"kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"
Feb 23 14:18:28.207664 master-0 kubenswrapper[7728]: I0223 14:18:28.207651 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:18:28.207717 master-0 kubenswrapper[7728]: I0223 14:18:28.207671 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmkf\" (UniqueName: \"kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:28.207717 master-0 kubenswrapper[7728]: I0223 14:18:28.207693 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tkx\" (UniqueName: \"kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:28.207865 master-0 kubenswrapper[7728]: I0223 14:18:28.207841 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:18:28.207900 master-0 kubenswrapper[7728]: I0223 14:18:28.207857 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:18:28.207979 master-0 kubenswrapper[7728]: I0223 14:18:28.207958 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:28.208094 master-0 kubenswrapper[7728]: I0223 14:18:28.208067 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:18:28.208125 master-0 kubenswrapper[7728]: I0223 14:18:28.208099 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 14:18:28.208268 master-0 kubenswrapper[7728]: I0223 14:18:28.208241 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:28.208434 master-0 kubenswrapper[7728]: I0223 14:18:28.208411 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 23 14:18:28.214783 master-0 kubenswrapper[7728]: I0223 14:18:28.214429 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:18:28.215450 master-0 kubenswrapper[7728]: I0223 14:18:28.215419 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:28.215969 master-0 kubenswrapper[7728]: I0223 14:18:28.215922 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.221561 master-0 kubenswrapper[7728]: I0223 14:18:28.221521 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 23 14:18:28.223899 master-0 kubenswrapper[7728]: I0223 14:18:28.223867 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:28.226627 master-0 kubenswrapper[7728]: I0223 14:18:28.226599 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"
Feb 23 14:18:28.229715 master-0 kubenswrapper[7728]: I0223 14:18:28.229686 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j54j5\" (UniqueName: \"kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:28.232556 master-0 kubenswrapper[7728]: I0223 14:18:28.232528 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5kb\" (UniqueName: \"kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:18:28.233086 master-0 kubenswrapper[7728]: I0223 14:18:28.233060 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:18:28.233968 master-0 kubenswrapper[7728]: I0223 14:18:28.233941 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:28.234069 master-0 kubenswrapper[7728]: I0223 14:18:28.234043 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6tj\" (UniqueName: \"kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"
Feb 23 14:18:28.242393 master-0 kubenswrapper[7728]: I0223 14:18:28.242367 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2p2\" (UniqueName: \"kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:18:28.268277 master-0 kubenswrapper[7728]: I0223 14:18:28.268231 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chznd\" (UniqueName: \"kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"
Feb 23 14:18:28.268817 master-0 kubenswrapper[7728]: I0223 14:18:28.268793 7728 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Feb 23 14:18:28.285405 master-0 kubenswrapper[7728]: I0223 14:18:28.284997 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sbp\" (UniqueName: \"kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:28.317867 master-0 kubenswrapper[7728]: I0223 14:18:28.317728 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.318049 master-0 kubenswrapper[7728]: I0223 14:18:28.317881 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.318049 master-0 kubenswrapper[7728]: I0223 14:18:28.317920 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:18:28.318049 master-0 kubenswrapper[7728]: I0223 14:18:28.317955 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:28.318049 master-0 kubenswrapper[7728]: I0223 14:18:28.317983 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:28.318049 master-0 kubenswrapper[7728]: I0223 14:18:28.318010 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:18:28.318049 master-0 kubenswrapper[7728]: E0223 14:18:28.317999 7728 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Feb 23 14:18:28.318049 master-0 kubenswrapper[7728]: I0223 14:18:28.318038 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: E0223 14:18:28.318081 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.818064449 +0000 UTC m=+1.780725745 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : secret "metrics-daemon-secret" not found
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318105 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318132 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318171 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318190 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318211 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318233 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318284 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318312 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318338 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318358 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318377 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:28.318377 master-0 kubenswrapper[7728]: I0223 14:18:28.318397 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318428 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318448 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318512 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318523 7728 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318538 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318568 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.818553218 +0000 UTC m=+1.781214524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318589 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318649 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318678 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318718 7728 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318781 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.818771562 +0000 UTC m=+1.781432858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318725 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318839 7728 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318869 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.818856823 +0000 UTC m=+1.781518119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318880 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.318905 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.818899824 +0000 UTC m=+1.781561120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318917 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318930 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318954 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8llc8\" (UniqueName: \"kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8\") pod \"csi-snapshot-controller-operator-6fb4df594f-hkcgz\" (UID: \"a4ae9292-71dc-4484-b277-43cb26c1e04d\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.318966 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319007 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319022 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319064 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319083 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319092 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319104 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319063 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319140 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName:
\"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319149 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319160 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319181 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319186 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319217 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319245 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319264 7728 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319275 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319288 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.819279871 +0000 UTC m=+1.781941167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319325 7728 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319358 7728 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319376 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.819344053 +0000 UTC m=+1.782005339 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319325 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319399 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.819387853 +0000 UTC m=+1.782049159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319431 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319435 7728 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319468 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.819460865 +0000 UTC m=+1.782122161 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319472 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319504 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319511 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319532 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319542 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: E0223 14:18:28.319563 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:28.819553746 +0000 UTC m=+1.782215062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319583 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319607 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319639 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod 
\"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319690 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319693 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319740 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319758 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319769 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" 
(UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319772 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319802 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319815 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319824 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319844 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319857 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319870 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319882 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319888 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319888 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 
kubenswrapper[7728]: I0223 14:18:28.319914 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319921 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319914 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.319780 master-0 kubenswrapper[7728]: I0223 14:18:28.319935 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.319980 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: 
I0223 14:18:28.319984 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320092 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320146 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320180 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320198 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320206 7728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320236 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320290 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320424 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.325173 master-0 kubenswrapper[7728]: I0223 14:18:28.320468 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.325173 master-0 
kubenswrapper[7728]: I0223 14:18:28.321169 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:18:28.337894 master-0 kubenswrapper[7728]: I0223 14:18:28.337849 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:28.360204 master-0 kubenswrapper[7728]: I0223 14:18:28.360131 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:28.368270 master-0 kubenswrapper[7728]: I0223 14:18:28.368213 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5f8j\" (UniqueName: \"kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:28.394812 master-0 kubenswrapper[7728]: I0223 14:18:28.394770 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qggzs\" (UniqueName: \"kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: 
\"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:28.416986 master-0 kubenswrapper[7728]: I0223 14:18:28.416895 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9z2f\" (UniqueName: \"kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:18:28.424351 master-0 kubenswrapper[7728]: I0223 14:18:28.424309 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7jvd\" (UniqueName: \"kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:18:28.447661 master-0 kubenswrapper[7728]: I0223 14:18:28.447618 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nr85\" (UniqueName: \"kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:18:28.464537 master-0 kubenswrapper[7728]: I0223 14:18:28.464375 7728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 23 14:18:28.465019 master-0 kubenswrapper[7728]: I0223 14:18:28.464969 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzkcs\" (UniqueName: \"kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:18:28.504836 master-0 kubenswrapper[7728]: I0223 14:18:28.504793 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjs6f\" (UniqueName: \"kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:18:28.507774 master-0 kubenswrapper[7728]: I0223 14:18:28.507738 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qszm\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:28.530549 master-0 kubenswrapper[7728]: I0223 14:18:28.530508 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88qnh\" (UniqueName: \"kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:18:28.553337 master-0 kubenswrapper[7728]: I0223 14:18:28.553294 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:18:28.590402 master-0 kubenswrapper[7728]: I0223 14:18:28.590358 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-269v7\" (UniqueName: \"kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:28.593640 master-0 kubenswrapper[7728]: I0223 14:18:28.593608 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brd4j\" (UniqueName: \"kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:18:28.606371 master-0 kubenswrapper[7728]: I0223 14:18:28.606326 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzsd\" (UniqueName: \"kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:28.623301 master-0 kubenswrapper[7728]: I0223 14:18:28.623258 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hp42\" (UniqueName: \"kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:18:28.643833 master-0 kubenswrapper[7728]: I0223 14:18:28.643795 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr868\" (UniqueName: \"kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"
Feb 23 14:18:28.663238 master-0 kubenswrapper[7728]: I0223 14:18:28.663204 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tkx\" (UniqueName: \"kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:28.683320 master-0 kubenswrapper[7728]: I0223 14:18:28.683279 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmkf\" (UniqueName: \"kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:18:28.776607 master-0 kubenswrapper[7728]: E0223 14:18:28.776559 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:28.786495 master-0 kubenswrapper[7728]: I0223 14:18:28.786434 7728 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 23 14:18:28.795913 master-0 kubenswrapper[7728]: I0223 14:18:28.795853 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:18:28.820218 master-0 kubenswrapper[7728]: E0223 14:18:28.820125 7728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e"
Feb 23 14:18:28.820447 master-0 kubenswrapper[7728]: E0223 14:18:28.820349 7728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:csi-snapshot-controller-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e,Command:[],Args:[start -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERAND_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:39d04e6e7ced98e7e189aff1bf392a4d4526e011fc6adead5c6b27dbd08776a9,ValueFrom:nil,},EnvVar{Name:WEBHOOK_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d953b34fe1ab03e9a57b3c91de4220683cf92e804edb5f5c230e5888e1c5a6d2,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.33,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.33,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8llc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000160000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-snapshot-controller-operator-6fb4df594f-hkcgz_openshift-cluster-storage-operator(a4ae9292-71dc-4484-b277-43cb26c1e04d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 23 14:18:28.821634 master-0 kubenswrapper[7728]: E0223 14:18:28.821579 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" podUID="a4ae9292-71dc-4484-b277-43cb26c1e04d"
Feb 23 14:18:28.826284 master-0 kubenswrapper[7728]: I0223 14:18:28.826217 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:28.826389 master-0 kubenswrapper[7728]: I0223 14:18:28.826288 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:28.826547 master-0 kubenswrapper[7728]: I0223 14:18:28.826444 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:28.826547 master-0 kubenswrapper[7728]: E0223 14:18:28.826449 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Feb 23 14:18:28.826547 master-0 kubenswrapper[7728]: E0223 14:18:28.826549 7728 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Feb 23 14:18:28.826712 master-0 kubenswrapper[7728]: E0223 14:18:28.826559 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.826536455 +0000 UTC m=+2.789197751 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found
Feb 23 14:18:28.826712 master-0 kubenswrapper[7728]: I0223 14:18:28.826642 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:28.826712 master-0 kubenswrapper[7728]: E0223 14:18:28.826699 7728 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:28.826842 master-0 kubenswrapper[7728]: E0223 14:18:28.826784 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.826722688 +0000 UTC m=+2.789384164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found
Feb 23 14:18:28.826890 master-0 kubenswrapper[7728]: I0223 14:18:28.826810 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:18:28.826890 master-0 kubenswrapper[7728]: E0223 14:18:28.826853 7728 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Feb 23 14:18:28.826890 master-0 kubenswrapper[7728]: I0223 14:18:28.826880 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:28.826890 master-0 kubenswrapper[7728]: E0223 14:18:28.826889 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.826878091 +0000 UTC m=+2.789539567 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found
Feb 23 14:18:28.827030 master-0 kubenswrapper[7728]: E0223 14:18:28.826914 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.826905872 +0000 UTC m=+2.789567168 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found
Feb 23 14:18:28.827030 master-0 kubenswrapper[7728]: I0223 14:18:28.826951 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:28.827030 master-0 kubenswrapper[7728]: E0223 14:18:28.826976 7728 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Feb 23 14:18:28.827030 master-0 kubenswrapper[7728]: E0223 14:18:28.827014 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.827002454 +0000 UTC m=+2.789663750 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : secret "metrics-daemon-secret" not found
Feb 23 14:18:28.827176 master-0 kubenswrapper[7728]: I0223 14:18:28.827033 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:18:28.827176 master-0 kubenswrapper[7728]: E0223 14:18:28.827050 7728 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:28.827176 master-0 kubenswrapper[7728]: E0223 14:18:28.827057 7728 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:28.827176 master-0 kubenswrapper[7728]: I0223 14:18:28.827105 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:28.827176 master-0 kubenswrapper[7728]: E0223 14:18:28.827125 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.827111846 +0000 UTC m=+2.789773152 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: E0223 14:18:28.827188 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.827172497 +0000 UTC m=+2.789833953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: E0223 14:18:28.827210 7728 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: I0223 14:18:28.827219 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: E0223 14:18:28.827268 7728 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: E0223 14:18:28.827273 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: E0223 14:18:28.827286 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.827268509 +0000 UTC m=+2.789929805 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: E0223 14:18:28.827306 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.827295709 +0000 UTC m=+2.789957025 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found
Feb 23 14:18:28.827362 master-0 kubenswrapper[7728]: E0223 14:18:28.827326 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:29.82731681 +0000 UTC m=+2.789978116 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:28.988496 master-0 kubenswrapper[7728]: I0223 14:18:28.988357 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:29.069529 master-0 kubenswrapper[7728]: I0223 14:18:29.069473 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:18:29.737795 master-0 kubenswrapper[7728]: I0223 14:18:29.737635 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:29.745507 master-0 kubenswrapper[7728]: I0223 14:18:29.745387 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:18:29.846329 master-0 kubenswrapper[7728]: I0223 14:18:29.846215 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:18:29.846329 master-0 kubenswrapper[7728]: I0223 14:18:29.846333 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:18:29.846683 master-0 kubenswrapper[7728]: I0223 14:18:29.846374 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:18:29.846683 master-0 kubenswrapper[7728]: E0223 14:18:29.846385 7728 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Feb 23 14:18:29.846683 master-0 kubenswrapper[7728]: I0223 14:18:29.846415 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:18:29.846683 master-0 kubenswrapper[7728]: I0223 14:18:29.846450 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"
Feb 23 14:18:29.846683 master-0 kubenswrapper[7728]: E0223 14:18:29.846497 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.846449813 +0000 UTC m=+4.809111119 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : secret "metrics-daemon-secret" not found
Feb 23 14:18:29.846683 master-0 kubenswrapper[7728]: E0223 14:18:29.846652 7728 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.846680 7728 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.846800 7728 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.846722 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.846700098 +0000 UTC m=+4.809361424 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: I0223 14:18:29.846890 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.846906 7728 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.846940 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.846952 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.846938323 +0000 UTC m=+4.809599659 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: I0223 14:18:29.846987 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.846999 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.846976853 +0000 UTC m=+4.809638189 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.847029 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.847015594 +0000 UTC m=+4.809676930 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.847061 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.847091 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.847077365 +0000 UTC m=+4.809738911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: I0223 14:18:29.847068 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.847116 7728 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.847124 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.847107346 +0000 UTC m=+4.809768772 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found
Feb 23 14:18:29.847150 master-0 kubenswrapper[7728]: E0223 14:18:29.847151 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.847141586 +0000 UTC m=+4.809802892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found
Feb 23 14:18:29.847806 master-0 kubenswrapper[7728]: I0223 14:18:29.847186 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:18:29.847806 master-0 kubenswrapper[7728]: I0223 14:18:29.847233 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:18:29.847806 master-0 kubenswrapper[7728]: E0223 14:18:29.847300 7728 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Feb 23 14:18:29.847806 master-0 kubenswrapper[7728]: E0223 14:18:29.847345 7728 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Feb 23 14:18:29.847806 master-0 kubenswrapper[7728]: E0223 14:18:29.847350 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.84733518 +0000 UTC m=+4.809996516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found
Feb 23 14:18:29.847806 master-0 kubenswrapper[7728]: E0223 14:18:29.847371 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:31.84736307 +0000 UTC m=+4.810024376 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found
Feb 23 14:18:30.228344 master-0 kubenswrapper[7728]: I0223 14:18:30.228191 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:18:30.508356 master-0 kubenswrapper[7728]: I0223 14:18:30.508184 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:30.547571 master-0 kubenswrapper[7728]: I0223 14:18:30.547461 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:18:31.289913 master-0 kubenswrapper[7728]: I0223 14:18:31.289834 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 14:18:31.289913 master-0 kubenswrapper[7728]: I0223 14:18:31.289859 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 23 14:18:31.334921 master-0 kubenswrapper[7728]: E0223 14:18:31.334828 7728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396"
Feb 23 14:18:31.335202 master-0 kubenswrapper[7728]: E0223 14:18:31.335141 7728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml
--terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt --terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.33,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.33,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7jvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-545bf96f4d-fpwtm_openshift-etcd-operator(8de1f285-47ac-42aa-8026-8addce656362): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 14:18:31.336433 master-0 kubenswrapper[7728]: E0223 14:18:31.336360 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" podUID="8de1f285-47ac-42aa-8026-8addce656362" Feb 23 14:18:31.818714 master-0 kubenswrapper[7728]: I0223 14:18:31.818306 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=7.818276837 podStartE2EDuration="7.818276837s" podCreationTimestamp="2026-02-23 14:18:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:18:31.818266397 +0000 UTC m=+4.780927733" watchObservedRunningTime="2026-02-23 14:18:31.818276837 +0000 UTC m=+4.780938173" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.878745 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.878814 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: 
\"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.878841 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.878876 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.878937 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.878973 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.879002 7728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.879030 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.879075 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: I0223 14:18:31.879099 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879243 7728 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879307 7728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.879282158 +0000 UTC m=+8.841943464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879357 7728 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879383 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.87937492 +0000 UTC m=+8.842036226 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879428 7728 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879452 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. 
No retries permitted until 2026-02-23 14:18:35.879444901 +0000 UTC m=+8.842106207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879523 7728 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879548 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.879541343 +0000 UTC m=+8.842202649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879591 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879612 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.879606154 +0000 UTC m=+8.842267460 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879652 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879673 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.879667225 +0000 UTC m=+8.842328531 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879713 7728 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879740 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.879730706 +0000 UTC m=+8.842392012 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879797 7728 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879829 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.879819148 +0000 UTC m=+8.842480454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879885 7728 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879915 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.87990527 +0000 UTC m=+8.842566576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.879973 7728 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Feb 23 14:18:31.880065 master-0 kubenswrapper[7728]: E0223 14:18:31.880004 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:35.879994781 +0000 UTC m=+8.842656097 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : secret "metrics-daemon-secret" not found Feb 23 14:18:33.246094 master-0 kubenswrapper[7728]: E0223 14:18:33.245706 7728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896" Feb 23 14:18:33.246094 master-0 kubenswrapper[7728]: E0223 14:18:33.246006 7728 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896,Command:[cluster-openshift-controller-manager-operator 
operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.33,ValueFrom:nil,},EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.33,ValueFrom:nil,},EnvVar{Name:ROUTE_CONTROLLER_MANAGER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2feb43a969b7dbc113ee1a572cbf6f9f922655,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-chznd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource
{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-controller-manager-operator-584cc7bcb5-67ds6_openshift-controller-manager-operator(cb6e88cd-98de-446a-92e8-f56a2f133703): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 14:18:33.247430 master-0 kubenswrapper[7728]: E0223 14:18:33.247352 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" podUID="cb6e88cd-98de-446a-92e8-f56a2f133703" Feb 23 14:18:33.462726 master-0 kubenswrapper[7728]: I0223 14:18:33.462652 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:33.463102 master-0 kubenswrapper[7728]: I0223 14:18:33.463035 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:18:33.463102 master-0 kubenswrapper[7728]: I0223 14:18:33.463056 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:18:33.538407 master-0 kubenswrapper[7728]: I0223 14:18:33.538324 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:34.098724 master-0 kubenswrapper[7728]: E0223 14:18:34.098649 7728 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b" Feb 23 14:18:34.100625 master-0 kubenswrapper[7728]: E0223 14:18:34.098862 7728 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:copy-operator-controller-manifests,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b,Command:[/bin/sh],Args:[-c cp -a /openshift/manifests /operand-assets/operator-controller],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:operand-assets,ReadOnly:false,MountPath:/operand-assets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vp6tj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cluster-olm-operator-5bd7768f54-bgg88_openshift-cluster-olm-operator(d2aa0d48-7c8e-4ddb-84a3-b3c34414c061): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 14:18:34.105229 master-0 kubenswrapper[7728]: E0223 14:18:34.105165 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"copy-operator-controller-manifests\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" podUID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" Feb 23 14:18:34.305750 master-0 kubenswrapper[7728]: I0223 14:18:34.304672 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:18:34.378695 master-0 kubenswrapper[7728]: I0223 14:18:34.378632 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x9gxm"] Feb 23 14:18:35.312490 master-0 kubenswrapper[7728]: I0223 14:18:35.312112 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerStarted","Data":"4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818"} Feb 23 14:18:35.313877 master-0 kubenswrapper[7728]: I0223 14:18:35.313829 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3"} Feb 23 14:18:35.318965 master-0 kubenswrapper[7728]: I0223 14:18:35.318925 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerStarted","Data":"6fdaded4c1d5d4706ada0063d02a22ac0f3bed1016ec71609468c9f080c894da"} Feb 23 14:18:35.322941 master-0 kubenswrapper[7728]: I0223 14:18:35.322853 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9gxm" event={"ID":"ded555da-db03-498e-81a9-ad166f29a2aa","Type":"ContainerStarted","Data":"ee198b41064b4937283c60c072335d7824964ece616cc687d14de79a4a2e8885"} Feb 23 14:18:35.323051 master-0 kubenswrapper[7728]: 
I0223 14:18:35.322945 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9gxm" event={"ID":"ded555da-db03-498e-81a9-ad166f29a2aa","Type":"ContainerStarted","Data":"7644d6b4dd6d2352356f500ef21c6602c372872ba1a236023043ba253ba34314"} Feb 23 14:18:35.325102 master-0 kubenswrapper[7728]: I0223 14:18:35.325042 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerStarted","Data":"5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673"} Feb 23 14:18:35.326985 master-0 kubenswrapper[7728]: I0223 14:18:35.326922 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerStarted","Data":"0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83"} Feb 23 14:18:35.329280 master-0 kubenswrapper[7728]: I0223 14:18:35.329217 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerStarted","Data":"515b3836a32aed4579312ac49c6468a1e7035624b7a30950b8364d5d10c9310d"} Feb 23 14:18:35.920514 master-0 kubenswrapper[7728]: I0223 14:18:35.909640 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:35.920514 master-0 kubenswrapper[7728]: I0223 14:18:35.909803 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:18:35.920514 master-0 kubenswrapper[7728]: I0223 14:18:35.917352 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:35.924149 
master-0 kubenswrapper[7728]: I0223 14:18:35.924108 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:35.924149 master-0 kubenswrapper[7728]: I0223 14:18:35.924151 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:35.924309 master-0 kubenswrapper[7728]: I0223 14:18:35.924172 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:35.924309 master-0 kubenswrapper[7728]: E0223 14:18:35.924298 7728 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:35.924426 master-0 kubenswrapper[7728]: E0223 14:18:35.924356 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924337141 +0000 UTC m=+16.886998527 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found Feb 23 14:18:35.924426 master-0 kubenswrapper[7728]: I0223 14:18:35.924384 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:35.924597 master-0 kubenswrapper[7728]: I0223 14:18:35.924425 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:35.924597 master-0 kubenswrapper[7728]: I0223 14:18:35.924452 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:35.924597 master-0 kubenswrapper[7728]: I0223 14:18:35.924494 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 
14:18:35.924597 master-0 kubenswrapper[7728]: E0223 14:18:35.924510 7728 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Feb 23 14:18:35.924597 master-0 kubenswrapper[7728]: E0223 14:18:35.924584 7728 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Feb 23 14:18:35.924597 master-0 kubenswrapper[7728]: E0223 14:18:35.924590 7728 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924620 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924590716 +0000 UTC m=+16.887252042 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: I0223 14:18:35.924519 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924648 7728 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924725 7728 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924649 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924636557 +0000 UTC m=+16.887297893 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924687 7728 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924755 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924745119 +0000 UTC m=+16.887406545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924774 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924764329 +0000 UTC m=+16.887425755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : secret "metrics-daemon-secret" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: I0223 14:18:35.924794 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924816 7728 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924835 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924853 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924841291 +0000 UTC m=+16.887502587 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924878 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924868241 +0000 UTC m=+16.887529547 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "performance-addon-operator-webhook-cert" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924878 7728 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924891 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls podName:607c1101-3533-43e3-9eda-13cea2b9dbb6 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924885152 +0000 UTC m=+16.887546598 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls") pod "dns-operator-8c7d49845-5rk2g" (UID: "607c1101-3533-43e3-9eda-13cea2b9dbb6") : secret "metrics-tls" not found Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: I0223 14:18:35.924837 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:35.924938 master-0 kubenswrapper[7728]: E0223 14:18:35.924906 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert podName:3cea0ab8-258b-486c-bb7f-8c93930b296d nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924899912 +0000 UTC m=+16.887561348 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert") pod "cluster-version-operator-5cfd9759cf-bsqrg" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d") : secret "cluster-version-operator-serving-cert" not found Feb 23 14:18:35.925935 master-0 kubenswrapper[7728]: E0223 14:18:35.924974 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls podName:57b57915-64dd-42f5-b06f-bc4bcc06b667 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:43.924964033 +0000 UTC m=+16.887625459 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bcf775fc9-z5t5b" (UID: "57b57915-64dd-42f5-b06f-bc4bcc06b667") : secret "node-tuning-operator-tls" not found Feb 23 14:18:36.332905 master-0 kubenswrapper[7728]: I0223 14:18:36.332852 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:18:36.363097 master-0 kubenswrapper[7728]: I0223 14:18:36.363031 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:36.373671 master-0 kubenswrapper[7728]: I0223 14:18:36.373613 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:37.437867 master-0 kubenswrapper[7728]: I0223 14:18:37.437785 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:37.442958 master-0 kubenswrapper[7728]: I0223 14:18:37.442837 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:18:37.459612 master-0 kubenswrapper[7728]: I0223 14:18:37.459060 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8"] Feb 23 14:18:37.459612 master-0 kubenswrapper[7728]: E0223 14:18:37.459427 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:18:37.459612 master-0 kubenswrapper[7728]: I0223 14:18:37.459440 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:18:37.459612 master-0 
kubenswrapper[7728]: E0223 14:18:37.459450 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerName="prober" Feb 23 14:18:37.459612 master-0 kubenswrapper[7728]: I0223 14:18:37.459458 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerName="prober" Feb 23 14:18:37.459612 master-0 kubenswrapper[7728]: I0223 14:18:37.459531 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ba77ca-71ea-4b69-8c91-f04b53b81aff" containerName="prober" Feb 23 14:18:37.459612 master-0 kubenswrapper[7728]: I0223 14:18:37.459541 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:18:37.460458 master-0 kubenswrapper[7728]: I0223 14:18:37.459840 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" Feb 23 14:18:37.463457 master-0 kubenswrapper[7728]: I0223 14:18:37.463379 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 23 14:18:37.463874 master-0 kubenswrapper[7728]: I0223 14:18:37.463808 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 14:18:37.544444 master-0 kubenswrapper[7728]: I0223 14:18:37.544315 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g999k\" (UniqueName: \"kubernetes.io/projected/8ca3dee6-f651-4536-991c-303752c22f07-kube-api-access-g999k\") pod \"migrator-5c85bff57-vk2x8\" (UID: \"8ca3dee6-f651-4536-991c-303752c22f07\") " pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" Feb 23 14:18:37.593973 master-0 kubenswrapper[7728]: I0223 14:18:37.593007 7728 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8"] Feb 23 14:18:37.645211 master-0 kubenswrapper[7728]: I0223 14:18:37.645134 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g999k\" (UniqueName: \"kubernetes.io/projected/8ca3dee6-f651-4536-991c-303752c22f07-kube-api-access-g999k\") pod \"migrator-5c85bff57-vk2x8\" (UID: \"8ca3dee6-f651-4536-991c-303752c22f07\") " pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" Feb 23 14:18:37.667760 master-0 kubenswrapper[7728]: I0223 14:18:37.667693 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g999k\" (UniqueName: \"kubernetes.io/projected/8ca3dee6-f651-4536-991c-303752c22f07-kube-api-access-g999k\") pod \"migrator-5c85bff57-vk2x8\" (UID: \"8ca3dee6-f651-4536-991c-303752c22f07\") " pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" Feb 23 14:18:37.790075 master-0 kubenswrapper[7728]: I0223 14:18:37.789777 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" Feb 23 14:18:37.959955 master-0 kubenswrapper[7728]: I0223 14:18:37.917007 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:18:38.107373 master-0 kubenswrapper[7728]: I0223 14:18:38.107120 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8"] Feb 23 14:18:38.343572 master-0 kubenswrapper[7728]: I0223 14:18:38.343356 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" event={"ID":"8ca3dee6-f651-4536-991c-303752c22f07","Type":"ContainerStarted","Data":"71856c04f28ed0a4e9a36c70729f2e0d164816c342db7fab0a6d5f76b0f61b6a"} Feb 23 14:18:38.772551 master-0 kubenswrapper[7728]: I0223 14:18:38.764543 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-576b4d78bd-lq6ct"] Feb 23 14:18:38.772551 master-0 kubenswrapper[7728]: I0223 14:18:38.765256 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.772551 master-0 kubenswrapper[7728]: I0223 14:18:38.770826 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 23 14:18:38.772551 master-0 kubenswrapper[7728]: I0223 14:18:38.771351 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 23 14:18:38.772551 master-0 kubenswrapper[7728]: I0223 14:18:38.771687 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 14:18:38.772551 master-0 kubenswrapper[7728]: I0223 14:18:38.772161 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 23 14:18:38.791075 master-0 kubenswrapper[7728]: I0223 14:18:38.790984 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-576b4d78bd-lq6ct"] Feb 23 14:18:38.865739 master-0 kubenswrapper[7728]: I0223 14:18:38.865664 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhr9\" (UniqueName: \"kubernetes.io/projected/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-kube-api-access-6qhr9\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.865739 master-0 kubenswrapper[7728]: I0223 14:18:38.865736 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-key\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.870761 master-0 kubenswrapper[7728]: I0223 14:18:38.865930 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-cabundle\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.966770 master-0 kubenswrapper[7728]: I0223 14:18:38.966719 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhr9\" (UniqueName: \"kubernetes.io/projected/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-kube-api-access-6qhr9\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.966770 master-0 kubenswrapper[7728]: I0223 14:18:38.966778 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-key\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.967055 master-0 kubenswrapper[7728]: I0223 14:18:38.966820 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-cabundle\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.967839 master-0 kubenswrapper[7728]: I0223 14:18:38.967810 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-cabundle\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:38.977688 
master-0 kubenswrapper[7728]: I0223 14:18:38.977615 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-key\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:39.000658 master-0 kubenswrapper[7728]: I0223 14:18:39.000549 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhr9\" (UniqueName: \"kubernetes.io/projected/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-kube-api-access-6qhr9\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:39.099662 master-0 kubenswrapper[7728]: I0223 14:18:39.099528 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:18:39.351919 master-0 kubenswrapper[7728]: I0223 14:18:39.351772 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t5h8h" event={"ID":"08c561b3-613b-425f-9de4-d5fc8762ea51","Type":"ContainerStarted","Data":"12fd878098bd7b54da3d1c8bde1617a7040cd38eee56d2ee103f7f6046abc156"} Feb 23 14:18:40.360072 master-0 kubenswrapper[7728]: I0223 14:18:40.359403 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" event={"ID":"8ca3dee6-f651-4536-991c-303752c22f07","Type":"ContainerStarted","Data":"656ff9c0d892162aa14c1f4924bdaef93d7972bd31e96f6230a2df8c99d0a8a8"} Feb 23 14:18:40.447439 master-0 kubenswrapper[7728]: I0223 14:18:40.442233 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-576b4d78bd-lq6ct"] Feb 23 14:18:40.544776 master-0 kubenswrapper[7728]: W0223 14:18:40.544689 7728 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod709ac071_4392_4a3f_a3d1_4bc8ba2f6236.slice/crio-4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a WatchSource:0}: Error finding container 4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a: Status 404 returned error can't find the container with id 4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a Feb 23 14:18:41.365024 master-0 kubenswrapper[7728]: I0223 14:18:41.364940 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" event={"ID":"8ca3dee6-f651-4536-991c-303752c22f07","Type":"ContainerStarted","Data":"0c89c563f0edf976572aa08fd148607bd8a41b8057fc467b03154cfc52d5048c"} Feb 23 14:18:41.366614 master-0 kubenswrapper[7728]: I0223 14:18:41.366540 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" event={"ID":"709ac071-4392-4a3f-a3d1-4bc8ba2f6236","Type":"ContainerStarted","Data":"c28d30a2b760e3ebbe98681a086eea9adf4942f9ca5f692597b7830f1309f2a8"} Feb 23 14:18:41.366614 master-0 kubenswrapper[7728]: I0223 14:18:41.366592 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" event={"ID":"709ac071-4392-4a3f-a3d1-4bc8ba2f6236","Type":"ContainerStarted","Data":"4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a"} Feb 23 14:18:41.393104 master-0 kubenswrapper[7728]: I0223 14:18:41.393020 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" podStartSLOduration=2.668520002 podStartE2EDuration="4.392994511s" podCreationTimestamp="2026-02-23 14:18:37 +0000 UTC" firstStartedPulling="2026-02-23 14:18:38.121699463 +0000 UTC m=+11.084360769" lastFinishedPulling="2026-02-23 14:18:39.846173982 +0000 UTC m=+12.808835278" observedRunningTime="2026-02-23 14:18:41.392840068 
+0000 UTC m=+14.355501434" watchObservedRunningTime="2026-02-23 14:18:41.392994511 +0000 UTC m=+14.355655837" Feb 23 14:18:41.436654 master-0 kubenswrapper[7728]: I0223 14:18:41.436474 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" podStartSLOduration=3.436440624 podStartE2EDuration="3.436440624s" podCreationTimestamp="2026-02-23 14:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:18:41.4346336 +0000 UTC m=+14.397294926" watchObservedRunningTime="2026-02-23 14:18:41.436440624 +0000 UTC m=+14.399101960" Feb 23 14:18:43.378507 master-0 kubenswrapper[7728]: I0223 14:18:43.378394 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" event={"ID":"a4ae9292-71dc-4484-b277-43cb26c1e04d","Type":"ContainerStarted","Data":"fafa7b0f21c17417165ff9592e80bbb6992685b66472f608cb30827b7d663491"} Feb 23 14:18:43.947175 master-0 kubenswrapper[7728]: I0223 14:18:43.947110 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv" Feb 23 14:18:43.947175 master-0 kubenswrapper[7728]: I0223 14:18:43.947197 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947253 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947302 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947353 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947412 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947470 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod 
\"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947565 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947617 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:18:43.947708 master-0 kubenswrapper[7728]: I0223 14:18:43.947672 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:18:43.948393 master-0 kubenswrapper[7728]: E0223 14:18:43.947913 7728 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Feb 23 14:18:43.948393 master-0 kubenswrapper[7728]: E0223 14:18:43.948005 7728 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert podName:5b54fc16-d2f7-4b10-a611-5b411b389c5a nodeName:}" failed. No retries permitted until 2026-02-23 14:18:59.947975058 +0000 UTC m=+32.910636394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert") pod "package-server-manager-5c75f78c8b-cj2l7" (UID: "5b54fc16-d2f7-4b10-a611-5b411b389c5a") : secret "package-server-manager-serving-cert" not found Feb 23 14:18:43.949129 master-0 kubenswrapper[7728]: E0223 14:18:43.949068 7728 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Feb 23 14:18:43.949265 master-0 kubenswrapper[7728]: E0223 14:18:43.949176 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs podName:842d45c5-3452-4e97-b5f5-540395330a65 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:59.949149 +0000 UTC m=+32.911810306 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs") pod "multus-admission-controller-5f98f4f8d5-fnc9v" (UID: "842d45c5-3452-4e97-b5f5-540395330a65") : secret "multus-admission-controller-secret" not found Feb 23 14:18:43.949265 master-0 kubenswrapper[7728]: E0223 14:18:43.949072 7728 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:43.949265 master-0 kubenswrapper[7728]: E0223 14:18:43.949228 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls podName:646fece3-4a42-4e0c-bcc7-5f705f948d63 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:59.949219591 +0000 UTC m=+32.911880897 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6bb6d78bf-wzqcp" (UID: "646fece3-4a42-4e0c-bcc7-5f705f948d63") : secret "cluster-monitoring-operator-tls" not found Feb 23 14:18:43.949265 master-0 kubenswrapper[7728]: E0223 14:18:43.949072 7728 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Feb 23 14:18:43.949265 master-0 kubenswrapper[7728]: E0223 14:18:43.949266 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics podName:585f74db-4593-426b-b0c7-ec8f64810549 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:59.949257742 +0000 UTC m=+32.911919048 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics") pod "marketplace-operator-6f5488b997-7b5sp" (UID: "585f74db-4593-426b-b0c7-ec8f64810549") : secret "marketplace-operator-metrics" not found Feb 23 14:18:43.949265 master-0 kubenswrapper[7728]: E0223 14:18:43.949238 7728 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Feb 23 14:18:43.949828 master-0 kubenswrapper[7728]: E0223 14:18:43.949301 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs podName:ace75aae-6f4f-4299-90e2-d5292271b136 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:59.949293762 +0000 UTC m=+32.911955068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs") pod "network-metrics-daemon-9dnsv" (UID: "ace75aae-6f4f-4299-90e2-d5292271b136") : secret "metrics-daemon-secret" not found Feb 23 14:18:43.950063 master-0 kubenswrapper[7728]: E0223 14:18:43.950000 7728 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Feb 23 14:18:43.950184 master-0 kubenswrapper[7728]: E0223 14:18:43.950117 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls podName:3488a7eb-5170-478c-9af7-490dbe0f514e nodeName:}" failed. No retries permitted until 2026-02-23 14:18:59.950087427 +0000 UTC m=+32.912748773 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls") pod "ingress-operator-6569778c84-hsl6c" (UID: "3488a7eb-5170-478c-9af7-490dbe0f514e") : secret "metrics-tls" not found Feb 23 14:18:43.955811 master-0 kubenswrapper[7728]: I0223 14:18:43.955756 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:43.956706 master-0 kubenswrapper[7728]: I0223 14:18:43.956199 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:43.956706 master-0 kubenswrapper[7728]: I0223 14:18:43.956295 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:43.958847 master-0 kubenswrapper[7728]: I0223 14:18:43.958760 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod \"cluster-version-operator-5cfd9759cf-bsqrg\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " 
pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:44.061830 master-0 kubenswrapper[7728]: I0223 14:18:44.061785 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:18:44.077188 master-0 kubenswrapper[7728]: I0223 14:18:44.076770 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:18:44.077188 master-0 kubenswrapper[7728]: I0223 14:18:44.077186 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:18:44.118237 master-0 kubenswrapper[7728]: W0223 14:18:44.117908 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cea0ab8_258b_486c_bb7f_8c93930b296d.slice/crio-d94e3aa12ea197f83dbe73a39e5dd5f5709d36ebed817bef04e39396f416a043 WatchSource:0}: Error finding container d94e3aa12ea197f83dbe73a39e5dd5f5709d36ebed817bef04e39396f416a043: Status 404 returned error can't find the container with id d94e3aa12ea197f83dbe73a39e5dd5f5709d36ebed817bef04e39396f416a043 Feb 23 14:18:44.323043 master-0 kubenswrapper[7728]: I0223 14:18:44.322871 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"] Feb 23 14:18:44.340407 master-0 kubenswrapper[7728]: W0223 14:18:44.340358 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b57915_64dd_42f5_b06f_bc4bcc06b667.slice/crio-18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39 WatchSource:0}: Error finding container 18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39: Status 404 returned error can't find the container 
with id 18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39 Feb 23 14:18:44.373290 master-0 kubenswrapper[7728]: I0223 14:18:44.372853 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-8c7d49845-5rk2g"] Feb 23 14:18:44.382461 master-0 kubenswrapper[7728]: W0223 14:18:44.382083 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607c1101_3533_43e3_9eda_13cea2b9dbb6.slice/crio-a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871 WatchSource:0}: Error finding container a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871: Status 404 returned error can't find the container with id a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871 Feb 23 14:18:44.387612 master-0 kubenswrapper[7728]: I0223 14:18:44.385819 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" event={"ID":"3cea0ab8-258b-486c-bb7f-8c93930b296d","Type":"ContainerStarted","Data":"d94e3aa12ea197f83dbe73a39e5dd5f5709d36ebed817bef04e39396f416a043"} Feb 23 14:18:44.390597 master-0 kubenswrapper[7728]: I0223 14:18:44.390536 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" event={"ID":"57b57915-64dd-42f5-b06f-bc4bcc06b667","Type":"ContainerStarted","Data":"18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39"} Feb 23 14:18:44.523507 master-0 kubenswrapper[7728]: I0223 14:18:44.523032 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x"] Feb 23 14:18:44.523507 master-0 kubenswrapper[7728]: I0223 14:18:44.523475 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" Feb 23 14:18:44.555706 master-0 kubenswrapper[7728]: I0223 14:18:44.555577 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x"] Feb 23 14:18:44.559910 master-0 kubenswrapper[7728]: I0223 14:18:44.559865 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhsc6\" (UniqueName: \"kubernetes.io/projected/2e89a047-9ebc-459b-b7b3-e902c1fb0e17-kube-api-access-bhsc6\") pod \"csi-snapshot-controller-6847bb4785-5fw2x\" (UID: \"2e89a047-9ebc-459b-b7b3-e902c1fb0e17\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" Feb 23 14:18:44.661684 master-0 kubenswrapper[7728]: I0223 14:18:44.661619 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhsc6\" (UniqueName: \"kubernetes.io/projected/2e89a047-9ebc-459b-b7b3-e902c1fb0e17-kube-api-access-bhsc6\") pod \"csi-snapshot-controller-6847bb4785-5fw2x\" (UID: \"2e89a047-9ebc-459b-b7b3-e902c1fb0e17\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" Feb 23 14:18:44.682459 master-0 kubenswrapper[7728]: I0223 14:18:44.682403 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhsc6\" (UniqueName: \"kubernetes.io/projected/2e89a047-9ebc-459b-b7b3-e902c1fb0e17-kube-api-access-bhsc6\") pod \"csi-snapshot-controller-6847bb4785-5fw2x\" (UID: \"2e89a047-9ebc-459b-b7b3-e902c1fb0e17\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" Feb 23 14:18:44.851817 master-0 kubenswrapper[7728]: I0223 14:18:44.851779 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" Feb 23 14:18:45.028514 master-0 kubenswrapper[7728]: I0223 14:18:45.028114 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x"] Feb 23 14:18:45.036396 master-0 kubenswrapper[7728]: W0223 14:18:45.036352 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e89a047_9ebc_459b_b7b3_e902c1fb0e17.slice/crio-390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7 WatchSource:0}: Error finding container 390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7: Status 404 returned error can't find the container with id 390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7 Feb 23 14:18:45.401367 master-0 kubenswrapper[7728]: I0223 14:18:45.400318 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerStarted","Data":"9576dd15e5e70c1d1ba1e6d5d639886620c60fa49c2ad4add67f8fd17b2dd5ba"} Feb 23 14:18:45.408703 master-0 kubenswrapper[7728]: I0223 14:18:45.402871 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7"} Feb 23 14:18:45.408703 master-0 kubenswrapper[7728]: I0223 14:18:45.404038 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" event={"ID":"607c1101-3533-43e3-9eda-13cea2b9dbb6","Type":"ContainerStarted","Data":"a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871"} Feb 23 14:18:47.299069 master-0 kubenswrapper[7728]: I0223 14:18:47.298778 7728 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:47.299560 master-0 kubenswrapper[7728]: I0223 14:18:47.299197 7728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:18:47.315922 master-0 kubenswrapper[7728]: I0223 14:18:47.315864 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:18:48.019933 master-0 kubenswrapper[7728]: I0223 14:18:48.019592 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-7cd76464f7-bfxtj"] Feb 23 14:18:48.020576 master-0 kubenswrapper[7728]: I0223 14:18:48.020537 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.023381 master-0 kubenswrapper[7728]: I0223 14:18:48.023274 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 14:18:48.029448 master-0 kubenswrapper[7728]: I0223 14:18:48.023715 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 14:18:48.030462 master-0 kubenswrapper[7728]: I0223 14:18:48.029932 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 23 14:18:48.030462 master-0 kubenswrapper[7728]: I0223 14:18:48.030206 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 14:18:48.030462 master-0 kubenswrapper[7728]: I0223 14:18:48.030269 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Feb 23 14:18:48.030654 master-0 kubenswrapper[7728]: I0223 14:18:48.030554 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Feb 23 14:18:48.031060 master-0 
kubenswrapper[7728]: I0223 14:18:48.030933 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 23 14:18:48.032860 master-0 kubenswrapper[7728]: I0223 14:18:48.031243 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 23 14:18:48.032860 master-0 kubenswrapper[7728]: I0223 14:18:48.031542 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 23 14:18:48.032860 master-0 kubenswrapper[7728]: I0223 14:18:48.032737 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-7cd76464f7-bfxtj"] Feb 23 14:18:48.036062 master-0 kubenswrapper[7728]: I0223 14:18:48.036039 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 14:18:48.093075 master-0 kubenswrapper[7728]: I0223 14:18:48.092755 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-encryption-config\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093075 master-0 kubenswrapper[7728]: I0223 14:18:48.093062 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093116 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-config\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093164 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093191 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-trusted-ca-bundle\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093210 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit-dir\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093236 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-node-pullsecrets\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093261 7728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmjj\" (UniqueName: \"kubernetes.io/projected/77499b29-66ab-4302-b2a4-d76c69b86c8c-kube-api-access-lpmjj\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093314 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093334 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-image-import-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.093388 master-0 kubenswrapper[7728]: I0223 14:18:48.093357 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.194690 master-0 kubenswrapper[7728]: I0223 14:18:48.194653 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " 
pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.194961 master-0 kubenswrapper[7728]: I0223 14:18:48.194943 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195083 master-0 kubenswrapper[7728]: I0223 14:18:48.195064 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-image-import-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195194 master-0 kubenswrapper[7728]: I0223 14:18:48.195178 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195309 master-0 kubenswrapper[7728]: I0223 14:18:48.195286 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-encryption-config\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195437 master-0 kubenswrapper[7728]: I0223 14:18:48.195418 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-config\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: 
\"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195582 master-0 kubenswrapper[7728]: I0223 14:18:48.195564 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195690 master-0 kubenswrapper[7728]: I0223 14:18:48.195675 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-trusted-ca-bundle\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195783 master-0 kubenswrapper[7728]: I0223 14:18:48.195708 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-image-import-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.195783 master-0 kubenswrapper[7728]: E0223 14:18:48.194950 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Feb 23 14:18:48.195873 master-0 kubenswrapper[7728]: E0223 14:18:48.195843 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:48.695825554 +0000 UTC m=+21.658486850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "audit-0" not found Feb 23 14:18:48.195972 master-0 kubenswrapper[7728]: I0223 14:18:48.195947 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit-dir\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.196132 master-0 kubenswrapper[7728]: E0223 14:18:48.195069 7728 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Feb 23 14:18:48.196229 master-0 kubenswrapper[7728]: I0223 14:18:48.195756 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit-dir\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.196281 master-0 kubenswrapper[7728]: E0223 14:18:48.195298 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Feb 23 14:18:48.196281 master-0 kubenswrapper[7728]: I0223 14:18:48.196093 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-config\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.196337 master-0 kubenswrapper[7728]: E0223 14:18:48.195737 7728 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Feb 23 14:18:48.196404 
master-0 kubenswrapper[7728]: E0223 14:18:48.196393 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:48.696218763 +0000 UTC m=+21.658880059 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "etcd-client" not found Feb 23 14:18:48.196512 master-0 kubenswrapper[7728]: I0223 14:18:48.196471 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-node-pullsecrets\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.196627 master-0 kubenswrapper[7728]: I0223 14:18:48.196538 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-node-pullsecrets\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:48.196695 master-0 kubenswrapper[7728]: E0223 14:18:48.196565 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:48.6965524 +0000 UTC m=+21.659213776 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "etcd-serving-ca" not found
Feb 23 14:18:48.196784 master-0 kubenswrapper[7728]: E0223 14:18:48.196774 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:48.696763094 +0000 UTC m=+21.659424390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "serving-cert" not found
Feb 23 14:18:48.196902 master-0 kubenswrapper[7728]: I0223 14:18:48.196885 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmjj\" (UniqueName: \"kubernetes.io/projected/77499b29-66ab-4302-b2a4-d76c69b86c8c-kube-api-access-lpmjj\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.197031 master-0 kubenswrapper[7728]: I0223 14:18:48.196644 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-trusted-ca-bundle\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.200178 master-0 kubenswrapper[7728]: I0223 14:18:48.200136 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-encryption-config\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.214041 master-0 kubenswrapper[7728]: I0223 14:18:48.213958 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmjj\" (UniqueName: \"kubernetes.io/projected/77499b29-66ab-4302-b2a4-d76c69b86c8c-kube-api-access-lpmjj\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.701823 master-0 kubenswrapper[7728]: I0223 14:18:48.701769 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: I0223 14:18:48.701843 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: I0223 14:18:48.701893 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: I0223 14:18:48.701911 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702017 7728 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702060 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:49.702048021 +0000 UTC m=+22.664709317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "etcd-client" not found
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702369 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702393 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:49.702386208 +0000 UTC m=+22.665047504 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "etcd-serving-ca" not found
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702436 7728 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702453 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:49.70244764 +0000 UTC m=+22.665108936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "serving-cert" not found
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702495 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Feb 23 14:18:48.702663 master-0 kubenswrapper[7728]: E0223 14:18:48.702513 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:49.702508201 +0000 UTC m=+22.665169497 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "audit-0" not found
Feb 23 14:18:49.710742 master-0 kubenswrapper[7728]: I0223 14:18:49.710701 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:49.710742 master-0 kubenswrapper[7728]: I0223 14:18:49.710776 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: I0223 14:18:49.710796 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: I0223 14:18:49.710816 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj"
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: E0223 14:18:49.710902 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: E0223 14:18:49.710971 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:51.710932847 +0000 UTC m=+24.673594143 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "etcd-serving-ca" not found
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: E0223 14:18:49.711342 7728 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: E0223 14:18:49.711367 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:51.711360306 +0000 UTC m=+24.674021602 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "serving-cert" not found
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: E0223 14:18:49.711430 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Feb 23 14:18:49.711457 master-0 kubenswrapper[7728]: E0223 14:18:49.711450 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:51.711443378 +0000 UTC m=+24.674104674 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "audit-0" not found
Feb 23 14:18:49.711814 master-0 kubenswrapper[7728]: E0223 14:18:49.711511 7728 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Feb 23 14:18:49.711814 master-0 kubenswrapper[7728]: E0223 14:18:49.711535 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:51.71152717 +0000 UTC m=+24.674188466 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "etcd-client" not found
Feb 23 14:18:50.423503 master-0 kubenswrapper[7728]: I0223 14:18:50.422584 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" event={"ID":"607c1101-3533-43e3-9eda-13cea2b9dbb6","Type":"ContainerStarted","Data":"af6d79da0d728e46a6d28f0e6d5149695cf9e0a5184f46643dc1c16723bb00aa"}
Feb 23 14:18:50.427502 master-0 kubenswrapper[7728]: I0223 14:18:50.423857 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerStarted","Data":"031c49419dbbce343a020e2a52b0b21aa31f7846ce6d6338d427aedeeb387c27"}
Feb 23 14:18:50.427502 master-0 kubenswrapper[7728]: I0223 14:18:50.424931 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" event={"ID":"3cea0ab8-258b-486c-bb7f-8c93930b296d","Type":"ContainerStarted","Data":"e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6"}
Feb 23 14:18:50.427502 master-0 kubenswrapper[7728]: I0223 14:18:50.427042 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69"}
Feb 23 14:18:50.432194 master-0 kubenswrapper[7728]: I0223 14:18:50.432146 7728 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="4bdbe696b77666c832d686aa40ee248bba34d80f9ddd9b86b73fd8952b7b6113" exitCode=0
Feb 23 14:18:50.432294 master-0 kubenswrapper[7728]: I0223 14:18:50.432234 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerDied","Data":"4bdbe696b77666c832d686aa40ee248bba34d80f9ddd9b86b73fd8952b7b6113"}
Feb 23 14:18:50.433329 master-0 kubenswrapper[7728]: I0223 14:18:50.433292 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" event={"ID":"57b57915-64dd-42f5-b06f-bc4bcc06b667","Type":"ContainerStarted","Data":"2f96ee533f5d52939bd2d7faf41993b118d9a6bfbb0b89e7580d1b1a849ba083"}
Feb 23 14:18:50.446051 master-0 kubenswrapper[7728]: I0223 14:18:50.445996 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-wsx6c"]
Feb 23 14:18:50.446607 master-0 kubenswrapper[7728]: I0223 14:18:50.446579 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.520967 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-tmp\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521042 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521067 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-run\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521105 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-kubernetes\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521124 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-systemd\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521144 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-modprobe-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521177 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysconfig\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521196 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-host\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521225 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmmn9\" (UniqueName: \"kubernetes.io/projected/9b558268-2262-4593-893e-408639a9987d-kube-api-access-nmmn9\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521281 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-lib-modules\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521311 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-etc-tuned\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521394 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-conf\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521587 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-sys\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.521845 master-0 kubenswrapper[7728]: I0223 14:18:50.521655 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-var-lib-kubelet\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.525040 master-0 kubenswrapper[7728]: I0223 14:18:50.524969 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" podStartSLOduration=1.580162379 podStartE2EDuration="6.524951566s" podCreationTimestamp="2026-02-23 14:18:44 +0000 UTC" firstStartedPulling="2026-02-23 14:18:45.039126742 +0000 UTC m=+18.001788038" lastFinishedPulling="2026-02-23 14:18:49.983915929 +0000 UTC m=+22.946577225" observedRunningTime="2026-02-23 14:18:50.454218812 +0000 UTC m=+23.416880108" watchObservedRunningTime="2026-02-23 14:18:50.524951566 +0000 UTC m=+23.487612862"
Feb 23 14:18:50.623219 master-0 kubenswrapper[7728]: I0223 14:18:50.623077 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-etc-tuned\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623219 master-0 kubenswrapper[7728]: I0223 14:18:50.623152 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-conf\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623219 master-0 kubenswrapper[7728]: I0223 14:18:50.623221 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-sys\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623252 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-var-lib-kubelet\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623276 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-tmp\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623299 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623316 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-run\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623345 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-kubernetes\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623365 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-systemd\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623385 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-modprobe-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623415 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysconfig\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623436 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-host\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623468 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmmn9\" (UniqueName: \"kubernetes.io/projected/9b558268-2262-4593-893e-408639a9987d-kube-api-access-nmmn9\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623539 master-0 kubenswrapper[7728]: I0223 14:18:50.623532 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-lib-modules\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623931 master-0 kubenswrapper[7728]: I0223 14:18:50.623700 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-run\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.623931 master-0 kubenswrapper[7728]: I0223 14:18:50.623895 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-lib-modules\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.624005 master-0 kubenswrapper[7728]: I0223 14:18:50.623989 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-kubernetes\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.624046 master-0 kubenswrapper[7728]: I0223 14:18:50.624030 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-systemd\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.624164 master-0 kubenswrapper[7728]: I0223 14:18:50.624096 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-modprobe-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.624164 master-0 kubenswrapper[7728]: I0223 14:18:50.624158 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysconfig\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.624277 master-0 kubenswrapper[7728]: I0223 14:18:50.624194 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-host\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.626662 master-0 kubenswrapper[7728]: I0223 14:18:50.624563 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-var-lib-kubelet\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.626662 master-0 kubenswrapper[7728]: I0223 14:18:50.624701 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-conf\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.626662 master-0 kubenswrapper[7728]: I0223 14:18:50.624718 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-sys\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.626662 master-0 kubenswrapper[7728]: I0223 14:18:50.624746 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.630105 master-0 kubenswrapper[7728]: I0223 14:18:50.630059 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-etc-tuned\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.631008 master-0 kubenswrapper[7728]: I0223 14:18:50.630270 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-tmp\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.644620 master-0 kubenswrapper[7728]: I0223 14:18:50.644567 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmmn9\" (UniqueName: \"kubernetes.io/projected/9b558268-2262-4593-893e-408639a9987d-kube-api-access-nmmn9\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.841977 master-0 kubenswrapper[7728]: I0223 14:18:50.841578 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:18:50.917550 master-0 kubenswrapper[7728]: I0223 14:18:50.917489 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-86l7f"]
Feb 23 14:18:50.918248 master-0 kubenswrapper[7728]: I0223 14:18:50.918224 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:50.924128 master-0 kubenswrapper[7728]: I0223 14:18:50.924107 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 23 14:18:50.924405 master-0 kubenswrapper[7728]: I0223 14:18:50.924391 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 23 14:18:50.924577 master-0 kubenswrapper[7728]: I0223 14:18:50.924560 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 23 14:18:50.924724 master-0 kubenswrapper[7728]: I0223 14:18:50.924706 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 23 14:18:50.927157 master-0 kubenswrapper[7728]: I0223 14:18:50.927012 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-86l7f"]
Feb 23 14:18:51.047171 master-0 kubenswrapper[7728]: I0223 14:18:51.047109 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.047353 master-0 kubenswrapper[7728]: I0223 14:18:51.047241 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqkz4\" (UniqueName: \"kubernetes.io/projected/6a801da1-a7eb-4187-98b8-315076f55e19-kube-api-access-pqkz4\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.047353 master-0 kubenswrapper[7728]: I0223 14:18:51.047289 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a801da1-a7eb-4187-98b8-315076f55e19-config-volume\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.148753 master-0 kubenswrapper[7728]: I0223 14:18:51.148654 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.148753 master-0 kubenswrapper[7728]: I0223 14:18:51.148739 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqkz4\" (UniqueName: \"kubernetes.io/projected/6a801da1-a7eb-4187-98b8-315076f55e19-kube-api-access-pqkz4\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.149036 master-0 kubenswrapper[7728]: E0223 14:18:51.148862 7728 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Feb 23 14:18:51.149036 master-0 kubenswrapper[7728]: E0223 14:18:51.148937 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls podName:6a801da1-a7eb-4187-98b8-315076f55e19 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:51.648916579 +0000 UTC m=+24.611577875 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls") pod "dns-default-86l7f" (UID: "6a801da1-a7eb-4187-98b8-315076f55e19") : secret "dns-default-metrics-tls" not found
Feb 23 14:18:51.149176 master-0 kubenswrapper[7728]: I0223 14:18:51.149041 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a801da1-a7eb-4187-98b8-315076f55e19-config-volume\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.150247 master-0 kubenswrapper[7728]: I0223 14:18:51.150210 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a801da1-a7eb-4187-98b8-315076f55e19-config-volume\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.172509 master-0 kubenswrapper[7728]: I0223 14:18:51.170849 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqkz4\" (UniqueName: \"kubernetes.io/projected/6a801da1-a7eb-4187-98b8-315076f55e19-kube-api-access-pqkz4\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:51.209015 master-0 kubenswrapper[7728]: I0223 14:18:51.208966 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7b6jk"]
Feb 23 14:18:51.210194 master-0 kubenswrapper[7728]: I0223 14:18:51.209372 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:18:51.250646 master-0 kubenswrapper[7728]: I0223 14:18:51.250595 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwtcj\" (UniqueName: \"kubernetes.io/projected/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-kube-api-access-pwtcj\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:18:51.250895 master-0 kubenswrapper[7728]: I0223 14:18:51.250665 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-hosts-file\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:18:51.351504 master-0 kubenswrapper[7728]: I0223 14:18:51.351405 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwtcj\" (UniqueName: \"kubernetes.io/projected/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-kube-api-access-pwtcj\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:18:51.351504 master-0 kubenswrapper[7728]: I0223 14:18:51.351472 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-hosts-file\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:18:51.351770 master-0 kubenswrapper[7728]: I0223 14:18:51.351665 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-hosts-file\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") "
pod="openshift-dns/node-resolver-7b6jk" Feb 23 14:18:51.367444 master-0 kubenswrapper[7728]: I0223 14:18:51.367397 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwtcj\" (UniqueName: \"kubernetes.io/projected/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-kube-api-access-pwtcj\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk" Feb 23 14:18:51.448505 master-0 kubenswrapper[7728]: I0223 14:18:51.441071 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" event={"ID":"9b558268-2262-4593-893e-408639a9987d","Type":"ContainerStarted","Data":"deea197b57c536e08f67845d8b2b560fc3c42e69e2de93f0b375ab236c9ca72e"} Feb 23 14:18:51.448505 master-0 kubenswrapper[7728]: I0223 14:18:51.441155 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" event={"ID":"9b558268-2262-4593-893e-408639a9987d","Type":"ContainerStarted","Data":"16f09e4885901a350ad5d473f91c2d104b970ac63a52a074d9fd82367db4b586"} Feb 23 14:18:51.448505 master-0 kubenswrapper[7728]: I0223 14:18:51.445977 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" event={"ID":"607c1101-3533-43e3-9eda-13cea2b9dbb6","Type":"ContainerStarted","Data":"45850e1beb05a95ae5fcf3a42237f73e3fe69700a603134a49b7e403ae4ba9f9"} Feb 23 14:18:51.478504 master-0 kubenswrapper[7728]: I0223 14:18:51.478016 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" podStartSLOduration=1.4779961799999999 podStartE2EDuration="1.47799618s" podCreationTimestamp="2026-02-23 14:18:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:18:51.458635302 +0000 UTC m=+24.421296618" 
watchObservedRunningTime="2026-02-23 14:18:51.47799618 +0000 UTC m=+24.440657476" Feb 23 14:18:51.529533 master-0 kubenswrapper[7728]: I0223 14:18:51.528143 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7b6jk" Feb 23 14:18:51.543351 master-0 kubenswrapper[7728]: W0223 14:18:51.543292 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f876e5d_2e82_47d0_8a9c_adacf2bddf77.slice/crio-9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40 WatchSource:0}: Error finding container 9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40: Status 404 returned error can't find the container with id 9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40 Feb 23 14:18:51.662516 master-0 kubenswrapper[7728]: I0223 14:18:51.655946 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f" Feb 23 14:18:51.662516 master-0 kubenswrapper[7728]: E0223 14:18:51.656206 7728 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 23 14:18:51.662516 master-0 kubenswrapper[7728]: E0223 14:18:51.656310 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls podName:6a801da1-a7eb-4187-98b8-315076f55e19 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:52.656284971 +0000 UTC m=+25.618946297 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls") pod "dns-default-86l7f" (UID: "6a801da1-a7eb-4187-98b8-315076f55e19") : secret "dns-default-metrics-tls" not found Feb 23 14:18:51.763563 master-0 kubenswrapper[7728]: I0223 14:18:51.763213 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:51.763563 master-0 kubenswrapper[7728]: I0223 14:18:51.763557 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:51.763798 master-0 kubenswrapper[7728]: I0223 14:18:51.763594 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:51.763798 master-0 kubenswrapper[7728]: I0223 14:18:51.763655 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert\") pod \"apiserver-7cd76464f7-bfxtj\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:51.763900 master-0 kubenswrapper[7728]: E0223 14:18:51.763809 7728 secret.go:189] Couldn't get secret 
openshift-apiserver/serving-cert: secret "serving-cert" not found Feb 23 14:18:51.763900 master-0 kubenswrapper[7728]: E0223 14:18:51.763866 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:55.763847418 +0000 UTC m=+28.726508714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "serving-cert" not found Feb 23 14:18:51.764002 master-0 kubenswrapper[7728]: I0223 14:18:51.763916 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86"] Feb 23 14:18:51.764286 master-0 kubenswrapper[7728]: E0223 14:18:51.764247 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Feb 23 14:18:51.764349 master-0 kubenswrapper[7728]: E0223 14:18:51.764290 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:55.764280268 +0000 UTC m=+28.726941564 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "audit-0" not found Feb 23 14:18:51.764349 master-0 kubenswrapper[7728]: E0223 14:18:51.764340 7728 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found Feb 23 14:18:51.764439 master-0 kubenswrapper[7728]: E0223 14:18:51.764365 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:55.764357269 +0000 UTC m=+28.727018565 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : secret "etcd-client" not found Feb 23 14:18:51.764439 master-0 kubenswrapper[7728]: E0223 14:18:51.764396 7728 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Feb 23 14:18:51.764439 master-0 kubenswrapper[7728]: E0223 14:18:51.764419 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca podName:77499b29-66ab-4302-b2a4-d76c69b86c8c nodeName:}" failed. No retries permitted until 2026-02-23 14:18:55.764411351 +0000 UTC m=+28.727072647 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca") pod "apiserver-7cd76464f7-bfxtj" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c") : configmap "etcd-serving-ca" not found Feb 23 14:18:51.764589 master-0 kubenswrapper[7728]: I0223 14:18:51.764524 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.767083 master-0 kubenswrapper[7728]: I0223 14:18:51.767048 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 14:18:51.767245 master-0 kubenswrapper[7728]: I0223 14:18:51.767217 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 14:18:51.768817 master-0 kubenswrapper[7728]: I0223 14:18:51.768785 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 14:18:51.768891 master-0 kubenswrapper[7728]: I0223 14:18:51.768834 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 14:18:51.768891 master-0 kubenswrapper[7728]: I0223 14:18:51.768836 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 14:18:51.769002 master-0 kubenswrapper[7728]: I0223 14:18:51.768967 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 14:18:51.772807 master-0 kubenswrapper[7728]: I0223 14:18:51.772768 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86"] Feb 23 14:18:51.864360 master-0 kubenswrapper[7728]: I0223 14:18:51.864302 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.864982 master-0 kubenswrapper[7728]: I0223 14:18:51.864387 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.864982 master-0 kubenswrapper[7728]: I0223 14:18:51.864432 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.864982 master-0 kubenswrapper[7728]: I0223 14:18:51.864463 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.864982 master-0 kubenswrapper[7728]: I0223 14:18:51.864545 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5mv8\" (UniqueName: \"kubernetes.io/projected/e801012f-bd6e-42a5-81a3-82a8120e7b50-kube-api-access-l5mv8\") pod 
\"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.966560 master-0 kubenswrapper[7728]: I0223 14:18:51.966270 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5mv8\" (UniqueName: \"kubernetes.io/projected/e801012f-bd6e-42a5-81a3-82a8120e7b50-kube-api-access-l5mv8\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.966560 master-0 kubenswrapper[7728]: I0223 14:18:51.966540 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.966775 master-0 kubenswrapper[7728]: I0223 14:18:51.966632 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.966775 master-0 kubenswrapper[7728]: E0223 14:18:51.966743 7728 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Feb 23 14:18:51.966914 master-0 kubenswrapper[7728]: E0223 14:18:51.966809 7728 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Feb 23 14:18:51.966914 master-0 kubenswrapper[7728]: E0223 14:18:51.966831 7728 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:52.466806211 +0000 UTC m=+25.429467537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : configmap "openshift-global-ca" not found Feb 23 14:18:51.966914 master-0 kubenswrapper[7728]: E0223 14:18:51.966886 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:52.466868093 +0000 UTC m=+25.429529379 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : configmap "client-ca" not found Feb 23 14:18:51.967080 master-0 kubenswrapper[7728]: I0223 14:18:51.967000 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.967187 master-0 kubenswrapper[7728]: I0223 14:18:51.967157 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " 
pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:51.967244 master-0 kubenswrapper[7728]: E0223 14:18:51.967162 7728 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Feb 23 14:18:51.967321 master-0 kubenswrapper[7728]: E0223 14:18:51.967284 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:52.467265951 +0000 UTC m=+25.429927257 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : secret "serving-cert" not found Feb 23 14:18:51.967357 master-0 kubenswrapper[7728]: E0223 14:18:51.967219 7728 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Feb 23 14:18:51.967389 master-0 kubenswrapper[7728]: E0223 14:18:51.967360 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:52.467350923 +0000 UTC m=+25.430012229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : configmap "config" not found Feb 23 14:18:51.998684 master-0 kubenswrapper[7728]: I0223 14:18:51.998615 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5mv8\" (UniqueName: \"kubernetes.io/projected/e801012f-bd6e-42a5-81a3-82a8120e7b50-kube-api-access-l5mv8\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:52.451170 master-0 kubenswrapper[7728]: I0223 14:18:52.451087 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7b6jk" event={"ID":"2f876e5d-2e82-47d0-8a9c-adacf2bddf77","Type":"ContainerStarted","Data":"f105f21310354d57ab5d681251f5d073aac907cbf6a02b20b473f0be82c5c836"} Feb 23 14:18:52.451170 master-0 kubenswrapper[7728]: I0223 14:18:52.451156 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7b6jk" event={"ID":"2f876e5d-2e82-47d0-8a9c-adacf2bddf77","Type":"ContainerStarted","Data":"9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40"} Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: I0223 14:18:52.482090 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: I0223 14:18:52.482189 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482323 7728 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482383 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:53.482366019 +0000 UTC m=+26.445027325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : configmap "config" not found Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482402 7728 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482595 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:53.482564554 +0000 UTC m=+26.445225870 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : secret "serving-cert" not found Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: I0223 14:18:52.482644 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: I0223 14:18:52.482775 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482805 7728 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482898 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:53.48287336 +0000 UTC m=+26.445534676 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : configmap "client-ca" not found Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482924 7728 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Feb 23 14:18:52.482163 master-0 kubenswrapper[7728]: E0223 14:18:52.482983 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:53.482972782 +0000 UTC m=+26.445634088 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : configmap "openshift-global-ca" not found Feb 23 14:18:52.685551 master-0 kubenswrapper[7728]: I0223 14:18:52.685448 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f" Feb 23 14:18:52.685956 master-0 kubenswrapper[7728]: E0223 14:18:52.685621 7728 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 23 14:18:52.685956 master-0 kubenswrapper[7728]: E0223 14:18:52.685674 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls podName:6a801da1-a7eb-4187-98b8-315076f55e19 nodeName:}" 
failed. No retries permitted until 2026-02-23 14:18:54.685660459 +0000 UTC m=+27.648321755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls") pod "dns-default-86l7f" (UID: "6a801da1-a7eb-4187-98b8-315076f55e19") : secret "dns-default-metrics-tls" not found Feb 23 14:18:52.863324 master-0 kubenswrapper[7728]: I0223 14:18:52.863182 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7b6jk" podStartSLOduration=1.863160284 podStartE2EDuration="1.863160284s" podCreationTimestamp="2026-02-23 14:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:18:52.470111925 +0000 UTC m=+25.432773231" watchObservedRunningTime="2026-02-23 14:18:52.863160284 +0000 UTC m=+25.825821580" Feb 23 14:18:52.864108 master-0 kubenswrapper[7728]: I0223 14:18:52.864070 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86"] Feb 23 14:18:52.864369 master-0 kubenswrapper[7728]: E0223 14:18:52.864322 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" podUID="e801012f-bd6e-42a5-81a3-82a8120e7b50" Feb 23 14:18:52.890137 master-0 kubenswrapper[7728]: I0223 14:18:52.890072 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f"] Feb 23 14:18:52.890920 master-0 kubenswrapper[7728]: I0223 14:18:52.890697 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:52.892742 master-0 kubenswrapper[7728]: I0223 14:18:52.892694 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 14:18:52.893661 master-0 kubenswrapper[7728]: I0223 14:18:52.893212 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 14:18:52.893958 master-0 kubenswrapper[7728]: I0223 14:18:52.893922 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 14:18:52.896075 master-0 kubenswrapper[7728]: I0223 14:18:52.894206 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 14:18:52.904341 master-0 kubenswrapper[7728]: I0223 14:18:52.899076 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 14:18:52.904873 master-0 kubenswrapper[7728]: I0223 14:18:52.904831 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f"] Feb 23 14:18:52.992206 master-0 kubenswrapper[7728]: I0223 14:18:52.992148 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4xp\" (UniqueName: \"kubernetes.io/projected/74ea74df-03a9-4879-b9ed-ea8760619dba-kube-api-access-5g4xp\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:52.992325 master-0 kubenswrapper[7728]: I0223 14:18:52.992225 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-config\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:52.992422 master-0 kubenswrapper[7728]: I0223 14:18:52.992375 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-client-ca\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:52.992574 master-0 kubenswrapper[7728]: I0223 14:18:52.992541 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.093815 master-0 kubenswrapper[7728]: I0223 14:18:53.093772 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-config\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.093923 master-0 kubenswrapper[7728]: I0223 14:18:53.093831 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-client-ca\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " 
pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.094064 master-0 kubenswrapper[7728]: I0223 14:18:53.094027 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.094164 master-0 kubenswrapper[7728]: I0223 14:18:53.094139 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4xp\" (UniqueName: \"kubernetes.io/projected/74ea74df-03a9-4879-b9ed-ea8760619dba-kube-api-access-5g4xp\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.094251 master-0 kubenswrapper[7728]: E0223 14:18:53.094225 7728 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Feb 23 14:18:53.094293 master-0 kubenswrapper[7728]: E0223 14:18:53.094284 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert podName:74ea74df-03a9-4879-b9ed-ea8760619dba nodeName:}" failed. No retries permitted until 2026-02-23 14:18:53.594268233 +0000 UTC m=+26.556929549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert") pod "route-controller-manager-79dfb855f-nzb5f" (UID: "74ea74df-03a9-4879-b9ed-ea8760619dba") : secret "serving-cert" not found Feb 23 14:18:53.094851 master-0 kubenswrapper[7728]: I0223 14:18:53.094815 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-client-ca\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.094927 master-0 kubenswrapper[7728]: I0223 14:18:53.094849 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-config\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.116239 master-0 kubenswrapper[7728]: I0223 14:18:53.116152 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4xp\" (UniqueName: \"kubernetes.io/projected/74ea74df-03a9-4879-b9ed-ea8760619dba-kube-api-access-5g4xp\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.448487 master-0 kubenswrapper[7728]: I0223 14:18:53.448402 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-7cd76464f7-bfxtj"] Feb 23 14:18:53.448736 master-0 kubenswrapper[7728]: E0223 14:18:53.448696 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit etcd-client etcd-serving-ca serving-cert], unattached volumes=[], 
failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" podUID="77499b29-66ab-4302-b2a4-d76c69b86c8c" Feb 23 14:18:53.457065 master-0 kubenswrapper[7728]: I0223 14:18:53.456998 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerStarted","Data":"bb60c962ed53b03fdfea9c76fcac5c126728571b797b7f917d784b1b7debd024"} Feb 23 14:18:53.457176 master-0 kubenswrapper[7728]: I0223 14:18:53.457034 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:53.457744 master-0 kubenswrapper[7728]: I0223 14:18:53.457710 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.474695 master-0 kubenswrapper[7728]: I0223 14:18:53.474659 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:53.488197 master-0 kubenswrapper[7728]: I0223 14:18:53.488146 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.498963 master-0 kubenswrapper[7728]: I0223 14:18:53.498847 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.498963 master-0 kubenswrapper[7728]: I0223 14:18:53.498931 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.499276 master-0 kubenswrapper[7728]: I0223 14:18:53.498977 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.499276 master-0 kubenswrapper[7728]: I0223 14:18:53.499070 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.500038 master-0 kubenswrapper[7728]: I0223 14:18:53.500009 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.503512 master-0 kubenswrapper[7728]: E0223 14:18:53.500276 7728 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Feb 23 14:18:53.503512 master-0 kubenswrapper[7728]: E0223 14:18:53.500390 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert podName:e801012f-bd6e-42a5-81a3-82a8120e7b50 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:55.500355412 +0000 UTC m=+28.463016768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert") pod "controller-manager-6c9b8f4d95-4vv86" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50") : secret "serving-cert" not found Feb 23 14:18:53.503512 master-0 kubenswrapper[7728]: I0223 14:18:53.502288 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.503813 master-0 kubenswrapper[7728]: I0223 14:18:53.503739 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles\") pod \"controller-manager-6c9b8f4d95-4vv86\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.599763 
7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles\") pod \"e801012f-bd6e-42a5-81a3-82a8120e7b50\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.599826 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-config\") pod \"77499b29-66ab-4302-b2a4-d76c69b86c8c\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.599855 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmjj\" (UniqueName: \"kubernetes.io/projected/77499b29-66ab-4302-b2a4-d76c69b86c8c-kube-api-access-lpmjj\") pod \"77499b29-66ab-4302-b2a4-d76c69b86c8c\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.599886 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca\") pod \"e801012f-bd6e-42a5-81a3-82a8120e7b50\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.599914 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-image-import-ca\") pod \"77499b29-66ab-4302-b2a4-d76c69b86c8c\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.599936 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit-dir\") pod \"77499b29-66ab-4302-b2a4-d76c69b86c8c\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.599975 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-node-pullsecrets\") pod \"77499b29-66ab-4302-b2a4-d76c69b86c8c\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600006 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5mv8\" (UniqueName: \"kubernetes.io/projected/e801012f-bd6e-42a5-81a3-82a8120e7b50-kube-api-access-l5mv8\") pod \"e801012f-bd6e-42a5-81a3-82a8120e7b50\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600031 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-encryption-config\") pod \"77499b29-66ab-4302-b2a4-d76c69b86c8c\" (UID: \"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600059 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config\") pod \"e801012f-bd6e-42a5-81a3-82a8120e7b50\" (UID: \"e801012f-bd6e-42a5-81a3-82a8120e7b50\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600089 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-trusted-ca-bundle\") pod \"77499b29-66ab-4302-b2a4-d76c69b86c8c\" (UID: 
\"77499b29-66ab-4302-b2a4-d76c69b86c8c\") " Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600222 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-config" (OuterVolumeSpecName: "config") pod "77499b29-66ab-4302-b2a4-d76c69b86c8c" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600258 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e801012f-bd6e-42a5-81a3-82a8120e7b50" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600399 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600444 7728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600541 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 
14:18:53.600270 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "77499b29-66ab-4302-b2a4-d76c69b86c8c" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: E0223 14:18:53.600633 7728 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600667 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "77499b29-66ab-4302-b2a4-d76c69b86c8c" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: E0223 14:18:53.600682 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert podName:74ea74df-03a9-4879-b9ed-ea8760619dba nodeName:}" failed. No retries permitted until 2026-02-23 14:18:54.600665173 +0000 UTC m=+27.563326469 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert") pod "route-controller-manager-79dfb855f-nzb5f" (UID: "74ea74df-03a9-4879-b9ed-ea8760619dba") : secret "serving-cert" not found Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600705 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "77499b29-66ab-4302-b2a4-d76c69b86c8c" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.600924 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca" (OuterVolumeSpecName: "client-ca") pod "e801012f-bd6e-42a5-81a3-82a8120e7b50" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.601370 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "77499b29-66ab-4302-b2a4-d76c69b86c8c" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:53.602682 master-0 kubenswrapper[7728]: I0223 14:18:53.601590 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config" (OuterVolumeSpecName: "config") pod "e801012f-bd6e-42a5-81a3-82a8120e7b50" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:53.603518 master-0 kubenswrapper[7728]: I0223 14:18:53.603358 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e801012f-bd6e-42a5-81a3-82a8120e7b50-kube-api-access-l5mv8" (OuterVolumeSpecName: "kube-api-access-l5mv8") pod "e801012f-bd6e-42a5-81a3-82a8120e7b50" (UID: "e801012f-bd6e-42a5-81a3-82a8120e7b50"). InnerVolumeSpecName "kube-api-access-l5mv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:18:53.609423 master-0 kubenswrapper[7728]: I0223 14:18:53.609388 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "77499b29-66ab-4302-b2a4-d76c69b86c8c" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:18:53.609423 master-0 kubenswrapper[7728]: I0223 14:18:53.609404 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77499b29-66ab-4302-b2a4-d76c69b86c8c-kube-api-access-lpmjj" (OuterVolumeSpecName: "kube-api-access-lpmjj") pod "77499b29-66ab-4302-b2a4-d76c69b86c8c" (UID: "77499b29-66ab-4302-b2a4-d76c69b86c8c"). InnerVolumeSpecName "kube-api-access-lpmjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701383 7728 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701415 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5mv8\" (UniqueName: \"kubernetes.io/projected/e801012f-bd6e-42a5-81a3-82a8120e7b50-kube-api-access-l5mv8\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701426 7728 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-encryption-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701435 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701443 7728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701453 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmjj\" (UniqueName: \"kubernetes.io/projected/77499b29-66ab-4302-b2a4-d76c69b86c8c-kube-api-access-lpmjj\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701461 7728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e801012f-bd6e-42a5-81a3-82a8120e7b50-client-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701469 7728 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-image-import-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:53.701580 master-0 kubenswrapper[7728]: I0223 14:18:53.701512 7728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:54.174392 master-0 kubenswrapper[7728]: I0223 14:18:54.174281 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Feb 23 14:18:54.174956 master-0 kubenswrapper[7728]: I0223 14:18:54.174849 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.177328 master-0 kubenswrapper[7728]: I0223 14:18:54.177295 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Feb 23 14:18:54.182572 master-0 kubenswrapper[7728]: I0223 14:18:54.182524 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Feb 23 14:18:54.316506 master-0 kubenswrapper[7728]: I0223 14:18:54.314000 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.316506 master-0 kubenswrapper[7728]: I0223 14:18:54.314297 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-var-lock\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.316506 master-0 kubenswrapper[7728]: I0223 14:18:54.314470 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kube-api-access\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.415986 master-0 kubenswrapper[7728]: I0223 14:18:54.415861 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-var-lock\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.415986 master-0 kubenswrapper[7728]: I0223 14:18:54.415959 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kube-api-access\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.416316 master-0 kubenswrapper[7728]: I0223 14:18:54.416047 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-var-lock\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.416316 master-0 kubenswrapper[7728]: I0223 14:18:54.416203 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.416464 master-0 kubenswrapper[7728]: I0223 14:18:54.416361 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.436464 master-0 kubenswrapper[7728]: I0223 14:18:54.436376 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kube-api-access\") pod \"installer-1-master-0\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.460288 master-0 kubenswrapper[7728]: I0223 14:18:54.460209 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86" Feb 23 14:18:54.460288 master-0 kubenswrapper[7728]: I0223 14:18:54.460281 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-7cd76464f7-bfxtj" Feb 23 14:18:54.472746 master-0 kubenswrapper[7728]: I0223 14:18:54.472699 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f"] Feb 23 14:18:54.473094 master-0 kubenswrapper[7728]: E0223 14:18:54.473056 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" podUID="74ea74df-03a9-4879-b9ed-ea8760619dba" Feb 23 14:18:54.494999 master-0 kubenswrapper[7728]: I0223 14:18:54.494918 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:18:54.517549 master-0 kubenswrapper[7728]: I0223 14:18:54.517513 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86"] Feb 23 14:18:54.519364 master-0 kubenswrapper[7728]: I0223 14:18:54.519317 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c9b8f4d95-4vv86"] Feb 23 14:18:54.622183 master-0 kubenswrapper[7728]: I0223 14:18:54.622151 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert\") pod \"route-controller-manager-79dfb855f-nzb5f\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:54.622284 master-0 kubenswrapper[7728]: I0223 14:18:54.622249 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e801012f-bd6e-42a5-81a3-82a8120e7b50-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 
14:18:54.622400 master-0 kubenswrapper[7728]: E0223 14:18:54.622363 7728 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Feb 23 14:18:54.622513 master-0 kubenswrapper[7728]: E0223 14:18:54.622453 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert podName:74ea74df-03a9-4879-b9ed-ea8760619dba nodeName:}" failed. No retries permitted until 2026-02-23 14:18:56.622431838 +0000 UTC m=+29.585093184 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert") pod "route-controller-manager-79dfb855f-nzb5f" (UID: "74ea74df-03a9-4879-b9ed-ea8760619dba") : secret "serving-cert" not found Feb 23 14:18:54.724255 master-0 kubenswrapper[7728]: I0223 14:18:54.722993 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f" Feb 23 14:18:54.724255 master-0 kubenswrapper[7728]: E0223 14:18:54.723202 7728 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Feb 23 14:18:54.724255 master-0 kubenswrapper[7728]: E0223 14:18:54.723256 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls podName:6a801da1-a7eb-4187-98b8-315076f55e19 nodeName:}" failed. No retries permitted until 2026-02-23 14:18:58.72323985 +0000 UTC m=+31.685901146 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls") pod "dns-default-86l7f" (UID: "6a801da1-a7eb-4187-98b8-315076f55e19") : secret "dns-default-metrics-tls" not found Feb 23 14:18:55.226773 master-0 kubenswrapper[7728]: I0223 14:18:55.226702 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e801012f-bd6e-42a5-81a3-82a8120e7b50" path="/var/lib/kubelet/pods/e801012f-bd6e-42a5-81a3-82a8120e7b50/volumes" Feb 23 14:18:55.447659 master-0 kubenswrapper[7728]: I0223 14:18:55.447553 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-7cd76464f7-bfxtj"] Feb 23 14:18:55.464190 master-0 kubenswrapper[7728]: I0223 14:18:55.464163 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:55.470616 master-0 kubenswrapper[7728]: I0223 14:18:55.470568 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:55.537050 master-0 kubenswrapper[7728]: I0223 14:18:55.536897 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g4xp\" (UniqueName: \"kubernetes.io/projected/74ea74df-03a9-4879-b9ed-ea8760619dba-kube-api-access-5g4xp\") pod \"74ea74df-03a9-4879-b9ed-ea8760619dba\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " Feb 23 14:18:55.537271 master-0 kubenswrapper[7728]: I0223 14:18:55.537055 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-config\") pod \"74ea74df-03a9-4879-b9ed-ea8760619dba\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " Feb 23 14:18:55.537271 master-0 kubenswrapper[7728]: I0223 14:18:55.537083 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-client-ca\") pod \"74ea74df-03a9-4879-b9ed-ea8760619dba\" (UID: \"74ea74df-03a9-4879-b9ed-ea8760619dba\") " Feb 23 14:18:55.537921 master-0 kubenswrapper[7728]: I0223 14:18:55.537869 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-client-ca" (OuterVolumeSpecName: "client-ca") pod "74ea74df-03a9-4879-b9ed-ea8760619dba" (UID: "74ea74df-03a9-4879-b9ed-ea8760619dba"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:55.538300 master-0 kubenswrapper[7728]: I0223 14:18:55.538266 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-config" (OuterVolumeSpecName: "config") pod "74ea74df-03a9-4879-b9ed-ea8760619dba" (UID: "74ea74df-03a9-4879-b9ed-ea8760619dba"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:18:55.539842 master-0 kubenswrapper[7728]: I0223 14:18:55.539775 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74ea74df-03a9-4879-b9ed-ea8760619dba-kube-api-access-5g4xp" (OuterVolumeSpecName: "kube-api-access-5g4xp") pod "74ea74df-03a9-4879-b9ed-ea8760619dba" (UID: "74ea74df-03a9-4879-b9ed-ea8760619dba"). InnerVolumeSpecName "kube-api-access-5g4xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:18:55.606799 master-0 kubenswrapper[7728]: I0223 14:18:55.606743 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-7cd76464f7-bfxtj"] Feb 23 14:18:55.606982 master-0 kubenswrapper[7728]: I0223 14:18:55.606823 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Feb 23 14:18:55.621620 master-0 kubenswrapper[7728]: W0223 14:18:55.621399 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5483bcd0_a9c7_4fdf_9c55_03f85a06b303.slice/crio-204d0959f6c407ac3fee0a708d6a503403aa51babda5e1e3e7023222624b4361 WatchSource:0}: Error finding container 204d0959f6c407ac3fee0a708d6a503403aa51babda5e1e3e7023222624b4361: Status 404 returned error can't find the container with id 204d0959f6c407ac3fee0a708d6a503403aa51babda5e1e3e7023222624b4361 Feb 23 14:18:55.638730 master-0 kubenswrapper[7728]: I0223 14:18:55.638691 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:55.638831 master-0 kubenswrapper[7728]: I0223 14:18:55.638736 7728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74ea74df-03a9-4879-b9ed-ea8760619dba-client-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 
14:18:55.638831 master-0 kubenswrapper[7728]: I0223 14:18:55.638757 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g4xp\" (UniqueName: \"kubernetes.io/projected/74ea74df-03a9-4879-b9ed-ea8760619dba-kube-api-access-5g4xp\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:55.740641 master-0 kubenswrapper[7728]: I0223 14:18:55.740435 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:55.740641 master-0 kubenswrapper[7728]: I0223 14:18:55.740513 7728 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-audit\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:55.740641 master-0 kubenswrapper[7728]: I0223 14:18:55.740531 7728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-client\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:55.740641 master-0 kubenswrapper[7728]: I0223 14:18:55.740548 7728 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/77499b29-66ab-4302-b2a4-d76c69b86c8c-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:56.434190 master-0 kubenswrapper[7728]: I0223 14:18:56.432996 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5896cbddf7-qqhhj"] Feb 23 14:18:56.434190 master-0 kubenswrapper[7728]: I0223 14:18:56.433952 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.440775 master-0 kubenswrapper[7728]: I0223 14:18:56.440672 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 14:18:56.441009 master-0 kubenswrapper[7728]: I0223 14:18:56.440916 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 14:18:56.441009 master-0 kubenswrapper[7728]: I0223 14:18:56.440948 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 14:18:56.441227 master-0 kubenswrapper[7728]: I0223 14:18:56.441198 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 14:18:56.441530 master-0 kubenswrapper[7728]: I0223 14:18:56.441451 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 14:18:56.452595 master-0 kubenswrapper[7728]: I0223 14:18:56.452533 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5896cbddf7-qqhhj"] Feb 23 14:18:56.452764 master-0 kubenswrapper[7728]: I0223 14:18:56.452738 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 14:18:56.493032 master-0 kubenswrapper[7728]: I0223 14:18:56.491840 7728 generic.go:334] "Generic (PLEG): container finished" podID="8de1f285-47ac-42aa-8026-8addce656362" containerID="9576dd15e5e70c1d1ba1e6d5d639886620c60fa49c2ad4add67f8fd17b2dd5ba" exitCode=0 Feb 23 14:18:56.493032 master-0 kubenswrapper[7728]: I0223 14:18:56.491952 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" 
event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerDied","Data":"9576dd15e5e70c1d1ba1e6d5d639886620c60fa49c2ad4add67f8fd17b2dd5ba"} Feb 23 14:18:56.493032 master-0 kubenswrapper[7728]: I0223 14:18:56.492349 7728 scope.go:117] "RemoveContainer" containerID="9576dd15e5e70c1d1ba1e6d5d639886620c60fa49c2ad4add67f8fd17b2dd5ba" Feb 23 14:18:56.503406 master-0 kubenswrapper[7728]: I0223 14:18:56.503354 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f" Feb 23 14:18:56.503909 master-0 kubenswrapper[7728]: I0223 14:18:56.503859 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"5483bcd0-a9c7-4fdf-9c55-03f85a06b303","Type":"ContainerStarted","Data":"2fddcd1257ffb4e028ba0fdbc707d561c98a5d237e44575892b25e37320d6d1d"} Feb 23 14:18:56.503983 master-0 kubenswrapper[7728]: I0223 14:18:56.503920 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"5483bcd0-a9c7-4fdf-9c55-03f85a06b303","Type":"ContainerStarted","Data":"204d0959f6c407ac3fee0a708d6a503403aa51babda5e1e3e7023222624b4361"} Feb 23 14:18:56.540430 master-0 kubenswrapper[7728]: I0223 14:18:56.540363 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=2.54034317 podStartE2EDuration="2.54034317s" podCreationTimestamp="2026-02-23 14:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:18:56.538559952 +0000 UTC m=+29.501221248" watchObservedRunningTime="2026-02-23 14:18:56.54034317 +0000 UTC m=+29.503004466" Feb 23 14:18:56.553677 master-0 kubenswrapper[7728]: I0223 14:18:56.550871 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-config\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.553677 master-0 kubenswrapper[7728]: I0223 14:18:56.550908 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.553677 master-0 kubenswrapper[7728]: I0223 14:18:56.550927 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-proxy-ca-bundles\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.553677 master-0 kubenswrapper[7728]: I0223 14:18:56.550993 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-client-ca\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.553677 master-0 kubenswrapper[7728]: I0223 14:18:56.551007 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chc9w\" (UniqueName: \"kubernetes.io/projected/65845117-85ab-4133-99ed-6dcd6e736e09-kube-api-access-chc9w\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: 
\"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.564341 master-0 kubenswrapper[7728]: I0223 14:18:56.564285 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f"] Feb 23 14:18:56.572236 master-0 kubenswrapper[7728]: I0223 14:18:56.572190 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79dfb855f-nzb5f"] Feb 23 14:18:56.652325 master-0 kubenswrapper[7728]: I0223 14:18:56.652275 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-client-ca\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.652325 master-0 kubenswrapper[7728]: I0223 14:18:56.652316 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chc9w\" (UniqueName: \"kubernetes.io/projected/65845117-85ab-4133-99ed-6dcd6e736e09-kube-api-access-chc9w\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.652586 master-0 kubenswrapper[7728]: I0223 14:18:56.652557 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-config\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.652674 master-0 kubenswrapper[7728]: I0223 14:18:56.652594 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.652742 master-0 kubenswrapper[7728]: E0223 14:18:56.652700 7728 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Feb 23 14:18:56.653685 master-0 kubenswrapper[7728]: I0223 14:18:56.653616 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-client-ca\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.653781 master-0 kubenswrapper[7728]: I0223 14:18:56.652756 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-proxy-ca-bundles\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.653781 master-0 kubenswrapper[7728]: I0223 14:18:56.653728 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-proxy-ca-bundles\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.653897 master-0 kubenswrapper[7728]: E0223 14:18:56.653795 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert podName:65845117-85ab-4133-99ed-6dcd6e736e09 
nodeName:}" failed. No retries permitted until 2026-02-23 14:18:57.15312634 +0000 UTC m=+30.115787636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert") pod "controller-manager-5896cbddf7-qqhhj" (UID: "65845117-85ab-4133-99ed-6dcd6e736e09") : secret "serving-cert" not found Feb 23 14:18:56.653897 master-0 kubenswrapper[7728]: I0223 14:18:56.653863 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74ea74df-03a9-4879-b9ed-ea8760619dba-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:18:56.654089 master-0 kubenswrapper[7728]: I0223 14:18:56.654026 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-config\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:56.687032 master-0 kubenswrapper[7728]: I0223 14:18:56.686906 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chc9w\" (UniqueName: \"kubernetes.io/projected/65845117-85ab-4133-99ed-6dcd6e736e09-kube-api-access-chc9w\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:57.157451 master-0 kubenswrapper[7728]: I0223 14:18:57.157396 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:57.160576 master-0 
kubenswrapper[7728]: I0223 14:18:57.160461 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert\") pod \"controller-manager-5896cbddf7-qqhhj\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:57.230620 master-0 kubenswrapper[7728]: I0223 14:18:57.229212 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74ea74df-03a9-4879-b9ed-ea8760619dba" path="/var/lib/kubelet/pods/74ea74df-03a9-4879-b9ed-ea8760619dba/volumes" Feb 23 14:18:57.230620 master-0 kubenswrapper[7728]: I0223 14:18:57.229738 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77499b29-66ab-4302-b2a4-d76c69b86c8c" path="/var/lib/kubelet/pods/77499b29-66ab-4302-b2a4-d76c69b86c8c/volumes" Feb 23 14:18:57.361819 master-0 kubenswrapper[7728]: I0223 14:18:57.361615 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-746d649f6b-54g2n"] Feb 23 14:18:57.368993 master-0 kubenswrapper[7728]: I0223 14:18:57.362951 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.368993 master-0 kubenswrapper[7728]: I0223 14:18:57.363446 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:18:57.368993 master-0 kubenswrapper[7728]: I0223 14:18:57.368022 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 23 14:18:57.368993 master-0 kubenswrapper[7728]: I0223 14:18:57.368282 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 14:18:57.368993 master-0 kubenswrapper[7728]: I0223 14:18:57.368462 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 23 14:18:57.371735 master-0 kubenswrapper[7728]: I0223 14:18:57.371671 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 14:18:57.372208 master-0 kubenswrapper[7728]: I0223 14:18:57.372165 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 23 14:18:57.372634 master-0 kubenswrapper[7728]: I0223 14:18:57.372591 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 23 14:18:57.372841 master-0 kubenswrapper[7728]: I0223 14:18:57.372803 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 14:18:57.373018 master-0 kubenswrapper[7728]: I0223 14:18:57.372979 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 14:18:57.374591 master-0 kubenswrapper[7728]: I0223 14:18:57.374531 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-746d649f6b-54g2n"] Feb 23 14:18:57.375675 master-0 kubenswrapper[7728]: I0223 14:18:57.375624 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 23 14:18:57.385441 master-0 kubenswrapper[7728]: I0223 
14:18:57.385383 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461139 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-node-pullsecrets\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461234 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-serving-cert\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461267 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-config\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461287 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit-dir\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461310 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-serving-ca\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461329 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-client\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461347 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461365 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-image-import-ca\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461384 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-encryption-config\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461404 7728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-trusted-ca-bundle\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.461545 master-0 kubenswrapper[7728]: I0223 14:18:57.461425 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4wpr\" (UniqueName: \"kubernetes.io/projected/7319dd65-3b07-4120-9a7b-60da5d0ed066-kube-api-access-s4wpr\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.545848 master-0 kubenswrapper[7728]: I0223 14:18:57.520654 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerStarted","Data":"5f94c8fde6ae66d48d8282c5c57e237550057485795720cf3e4f35047fc2b408"} Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588033 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4wpr\" (UniqueName: \"kubernetes.io/projected/7319dd65-3b07-4120-9a7b-60da5d0ed066-kube-api-access-s4wpr\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588110 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-node-pullsecrets\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.588504 master-0 
kubenswrapper[7728]: I0223 14:18:57.588160 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-serving-cert\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588184 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-config\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588201 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit-dir\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588220 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-serving-ca\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588236 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-client\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:18:57.588504 master-0 
kubenswrapper[7728]: I0223 14:18:57.588252 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588265 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-image-import-ca\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588279 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-encryption-config\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588295 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-trusted-ca-bundle\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.588504 master-0 kubenswrapper[7728]: I0223 14:18:57.588434 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit-dir\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.593545 master-0 kubenswrapper[7728]: I0223 14:18:57.589137 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-image-import-ca\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.593545 master-0 kubenswrapper[7728]: I0223 14:18:57.591899 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.593545 master-0 kubenswrapper[7728]: I0223 14:18:57.592652 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-node-pullsecrets\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.593545 master-0 kubenswrapper[7728]: I0223 14:18:57.593326 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-config\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.593545 master-0 kubenswrapper[7728]: I0223 14:18:57.593541 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-serving-ca\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.595003 master-0 kubenswrapper[7728]: I0223 14:18:57.594649 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-trusted-ca-bundle\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.595161 master-0 kubenswrapper[7728]: I0223 14:18:57.595132 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-client\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.604115 master-0 kubenswrapper[7728]: I0223 14:18:57.604065 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-serving-cert\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.615060 master-0 kubenswrapper[7728]: I0223 14:18:57.614948 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4wpr\" (UniqueName: \"kubernetes.io/projected/7319dd65-3b07-4120-9a7b-60da5d0ed066-kube-api-access-s4wpr\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.616929 master-0 kubenswrapper[7728]: I0223 14:18:57.616881 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-encryption-config\") pod \"apiserver-746d649f6b-54g2n\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:57.665266 master-0 kubenswrapper[7728]: I0223 14:18:57.665227 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5896cbddf7-qqhhj"]
Feb 23 14:18:57.671038 master-0 kubenswrapper[7728]: W0223 14:18:57.670991 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65845117_85ab_4133_99ed_6dcd6e736e09.slice/crio-e64ca766bdf85321a781ce69ed3e1869acf08ca216f304fdb5561750a836b6b8 WatchSource:0}: Error finding container e64ca766bdf85321a781ce69ed3e1869acf08ca216f304fdb5561750a836b6b8: Status 404 returned error can't find the container with id e64ca766bdf85321a781ce69ed3e1869acf08ca216f304fdb5561750a836b6b8
Feb 23 14:18:57.687565 master-0 kubenswrapper[7728]: I0223 14:18:57.687522 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-746d649f6b-54g2n"
Feb 23 14:18:58.075012 master-0 kubenswrapper[7728]: I0223 14:18:58.074958 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-746d649f6b-54g2n"]
Feb 23 14:18:58.359369 master-0 kubenswrapper[7728]: I0223 14:18:58.359263 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"]
Feb 23 14:18:58.360440 master-0 kubenswrapper[7728]: I0223 14:18:58.360045 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.363802 master-0 kubenswrapper[7728]: I0223 14:18:58.362689 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 14:18:58.363802 master-0 kubenswrapper[7728]: I0223 14:18:58.362738 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 14:18:58.363802 master-0 kubenswrapper[7728]: I0223 14:18:58.362699 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 14:18:58.363802 master-0 kubenswrapper[7728]: I0223 14:18:58.362934 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 14:18:58.363802 master-0 kubenswrapper[7728]: I0223 14:18:58.363510 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 14:18:58.373380 master-0 kubenswrapper[7728]: I0223 14:18:58.373323 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"]
Feb 23 14:18:58.498684 master-0 kubenswrapper[7728]: I0223 14:18:58.498599 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-client-ca\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.498684 master-0 kubenswrapper[7728]: I0223 14:18:58.498646 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d4323-f61f-43f0-958a-38b117880306-serving-cert\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.498684 master-0 kubenswrapper[7728]: I0223 14:18:58.498673 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-config\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.499208 master-0 kubenswrapper[7728]: I0223 14:18:58.498730 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cg6\" (UniqueName: \"kubernetes.io/projected/722d4323-f61f-43f0-958a-38b117880306-kube-api-access-f8cg6\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.524679 master-0 kubenswrapper[7728]: I0223 14:18:58.524618 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" event={"ID":"65845117-85ab-4133-99ed-6dcd6e736e09","Type":"ContainerStarted","Data":"e64ca766bdf85321a781ce69ed3e1869acf08ca216f304fdb5561750a836b6b8"}
Feb 23 14:18:58.526041 master-0 kubenswrapper[7728]: I0223 14:18:58.525974 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-746d649f6b-54g2n" event={"ID":"7319dd65-3b07-4120-9a7b-60da5d0ed066","Type":"ContainerStarted","Data":"8e5569d1d3d19f6d47cce98e450c91513b5ae752318bb0728d77dfab67e4d723"}
Feb 23 14:18:58.600199 master-0 kubenswrapper[7728]: I0223 14:18:58.600127 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8cg6\" (UniqueName: \"kubernetes.io/projected/722d4323-f61f-43f0-958a-38b117880306-kube-api-access-f8cg6\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.600448 master-0 kubenswrapper[7728]: I0223 14:18:58.600414 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-client-ca\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.600448 master-0 kubenswrapper[7728]: I0223 14:18:58.600444 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d4323-f61f-43f0-958a-38b117880306-serving-cert\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.600590 master-0 kubenswrapper[7728]: I0223 14:18:58.600505 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-config\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.601650 master-0 kubenswrapper[7728]: I0223 14:18:58.601617 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-config\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.601745 master-0 kubenswrapper[7728]: I0223 14:18:58.601692 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-client-ca\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.604203 master-0 kubenswrapper[7728]: I0223 14:18:58.604171 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d4323-f61f-43f0-958a-38b117880306-serving-cert\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.625705 master-0 kubenswrapper[7728]: I0223 14:18:58.625603 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cg6\" (UniqueName: \"kubernetes.io/projected/722d4323-f61f-43f0-958a-38b117880306-kube-api-access-f8cg6\") pod \"route-controller-manager-56fb65d69d-lnsbc\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.642937 master-0 kubenswrapper[7728]: I0223 14:18:58.642896 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-746d649f6b-54g2n"]
Feb 23 14:18:58.698548 master-0 kubenswrapper[7728]: I0223 14:18:58.698455 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"
Feb 23 14:18:58.811991 master-0 kubenswrapper[7728]: I0223 14:18:58.811935 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:58.814989 master-0 kubenswrapper[7728]: I0223 14:18:58.814932 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:59.043773 master-0 kubenswrapper[7728]: I0223 14:18:59.043282 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-86l7f"
Feb 23 14:18:59.063436 master-0 kubenswrapper[7728]: I0223 14:18:59.062870 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"]
Feb 23 14:18:59.073142 master-0 kubenswrapper[7728]: W0223 14:18:59.073105 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod722d4323_f61f_43f0_958a_38b117880306.slice/crio-5fc2b94e0100ca2b4a22051eea718d3cb70fbda931eb12d807c250eac49aeec2 WatchSource:0}: Error finding container 5fc2b94e0100ca2b4a22051eea718d3cb70fbda931eb12d807c250eac49aeec2: Status 404 returned error can't find the container with id 5fc2b94e0100ca2b4a22051eea718d3cb70fbda931eb12d807c250eac49aeec2
Feb 23 14:18:59.429591 master-0 kubenswrapper[7728]: I0223 14:18:59.429466 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-86l7f"]
Feb 23 14:18:59.530607 master-0 kubenswrapper[7728]: I0223 14:18:59.530548 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" event={"ID":"722d4323-f61f-43f0-958a-38b117880306","Type":"ContainerStarted","Data":"5fc2b94e0100ca2b4a22051eea718d3cb70fbda931eb12d807c250eac49aeec2"}
Feb 23 14:19:00.027583 master-0 kubenswrapper[7728]: I0223 14:19:00.027409 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:19:00.027583 master-0 kubenswrapper[7728]: I0223 14:19:00.027459 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:19:00.027583 master-0 kubenswrapper[7728]: I0223 14:19:00.027495 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:19:00.027583 master-0 kubenswrapper[7728]: I0223 14:19:00.027530 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:19:00.027583 master-0 kubenswrapper[7728]: I0223 14:19:00.027548 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:19:00.027583 master-0 kubenswrapper[7728]: I0223 14:19:00.027566 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:19:00.031262 master-0 kubenswrapper[7728]: I0223 14:19:00.031224 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"multus-admission-controller-5f98f4f8d5-fnc9v\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") " pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:19:00.032983 master-0 kubenswrapper[7728]: I0223 14:19:00.031790 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:19:00.032983 master-0 kubenswrapper[7728]: I0223 14:19:00.032638 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:19:00.033118 master-0 kubenswrapper[7728]: I0223 14:19:00.033077 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:19:00.035579 master-0 kubenswrapper[7728]: I0223 14:19:00.035559 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:19:00.035738 master-0 kubenswrapper[7728]: I0223 14:19:00.035695 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:19:00.263819 master-0 kubenswrapper[7728]: I0223 14:19:00.263763 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:19:00.263819 master-0 kubenswrapper[7728]: I0223 14:19:00.263804 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:19:00.264309 master-0 kubenswrapper[7728]: I0223 14:19:00.264284 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:19:00.264533 master-0 kubenswrapper[7728]: I0223 14:19:00.264515 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:19:00.270037 master-0 kubenswrapper[7728]: I0223 14:19:00.270001 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:19:00.270373 master-0 kubenswrapper[7728]: I0223 14:19:00.270350 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:19:00.290956 master-0 kubenswrapper[7728]: W0223 14:19:00.290852 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a801da1_a7eb_4187_98b8_315076f55e19.slice/crio-6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8 WatchSource:0}: Error finding container 6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8: Status 404 returned error can't find the container with id 6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8
Feb 23 14:19:00.534541 master-0 kubenswrapper[7728]: I0223 14:19:00.534465 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86l7f" event={"ID":"6a801da1-a7eb-4187-98b8-315076f55e19","Type":"ContainerStarted","Data":"6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8"}
Feb 23 14:19:04.700935 master-0 kubenswrapper[7728]: I0223 14:19:04.700873 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Feb 23 14:19:04.701656 master-0 kubenswrapper[7728]: I0223 14:19:04.701167 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="5483bcd0-a9c7-4fdf-9c55-03f85a06b303" containerName="installer" containerID="cri-o://2fddcd1257ffb4e028ba0fdbc707d561c98a5d237e44575892b25e37320d6d1d" gracePeriod=30
Feb 23 14:19:04.750527 master-0 kubenswrapper[7728]: I0223 14:19:04.747978 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"]
Feb 23 14:19:04.750527 master-0 kubenswrapper[7728]: I0223 14:19:04.748537 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.758011 master-0 kubenswrapper[7728]: I0223 14:19:04.757255 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 23 14:19:04.758011 master-0 kubenswrapper[7728]: I0223 14:19:04.757471 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 23 14:19:04.758011 master-0 kubenswrapper[7728]: I0223 14:19:04.757627 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 23 14:19:04.758011 master-0 kubenswrapper[7728]: I0223 14:19:04.757791 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 23 14:19:04.758011 master-0 kubenswrapper[7728]: I0223 14:19:04.757910 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 23 14:19:04.758364 master-0 kubenswrapper[7728]: I0223 14:19:04.758050 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 23 14:19:04.758364 master-0 kubenswrapper[7728]: I0223 14:19:04.758229 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 23 14:19:04.761023 master-0 kubenswrapper[7728]: I0223 14:19:04.760569 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 23 14:19:04.766259 master-0 kubenswrapper[7728]: I0223 14:19:04.765355 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"]
Feb 23 14:19:04.766259 master-0 kubenswrapper[7728]: I0223 14:19:04.765641 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" podUID="3cea0ab8-258b-486c-bb7f-8c93930b296d" containerName="cluster-version-operator" containerID="cri-o://e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6" gracePeriod=130
Feb 23 14:19:04.778733 master-0 kubenswrapper[7728]: I0223 14:19:04.773427 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"]
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893562 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-serving-ca\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893602 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cd7w\" (UniqueName: \"kubernetes.io/projected/ea0b3538-9a7d-4995-b628-2d63f21d683c-kube-api-access-2cd7w\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893629 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-client\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893645 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-serving-cert\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893805 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-trusted-ca-bundle\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893828 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-policies\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893853 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-dir\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.902639 master-0 kubenswrapper[7728]: I0223 14:19:04.893880 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-encryption-config\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.972385 master-0 kubenswrapper[7728]: I0223 14:19:04.972278 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"]
Feb 23 14:19:04.991788 master-0 kubenswrapper[7728]: I0223 14:19:04.991743 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"]
Feb 23 14:19:04.992114 master-0 kubenswrapper[7728]: I0223 14:19:04.992092 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"]
Feb 23 14:19:04.992114 master-0 kubenswrapper[7728]: I0223 14:19:04.992111 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"]
Feb 23 14:19:04.992206 master-0 kubenswrapper[7728]: I0223 14:19:04.992185 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:19:04.992508 master-0 kubenswrapper[7728]: I0223 14:19:04.992472 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:19:04.994466 master-0 kubenswrapper[7728]: I0223 14:19:04.994377 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-dir\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.994466 master-0 kubenswrapper[7728]: I0223 14:19:04.994425 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-encryption-config\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.994573 master-0 kubenswrapper[7728]: I0223 14:19:04.994449 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-serving-ca\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.994573 master-0 kubenswrapper[7728]: I0223 14:19:04.994527 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cd7w\" (UniqueName: \"kubernetes.io/projected/ea0b3538-9a7d-4995-b628-2d63f21d683c-kube-api-access-2cd7w\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.994573 master-0 kubenswrapper[7728]: I0223 14:19:04.994547 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-client\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.994573 master-0 kubenswrapper[7728]: I0223 14:19:04.994559 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-serving-cert\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.994715 master-0 kubenswrapper[7728]: I0223 14:19:04.994575 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-trusted-ca-bundle\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.994715 master-0 kubenswrapper[7728]: I0223 14:19:04.994595 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-policies\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.995386 master-0 kubenswrapper[7728]: I0223 14:19:04.995357 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-policies\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:04.995428 master-0 kubenswrapper[7728]: I0223 14:19:04.995407 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-dir\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:05.011362 master-0 kubenswrapper[7728]: I0223 14:19:05.001168 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-trusted-ca-bundle\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:05.015260 master-0 kubenswrapper[7728]: I0223 14:19:05.013919 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-serving-ca\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:05.015260 master-0 kubenswrapper[7728]: I0223 14:19:05.014224 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-encryption-config\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:05.024531 master-0 kubenswrapper[7728]: I0223 14:19:05.021347 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Feb 23 14:19:05.024531 master-0 kubenswrapper[7728]: I0223 14:19:05.021559 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Feb 23 14:19:05.031936 master-0 kubenswrapper[7728]: I0223 14:19:05.026158 7728 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-serving-cert\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:19:05.032770 master-0 kubenswrapper[7728]: I0223 14:19:05.032423 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-client\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:19:05.032770 master-0 kubenswrapper[7728]: I0223 14:19:05.032448 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Feb 23 14:19:05.032770 master-0 kubenswrapper[7728]: I0223 14:19:05.032646 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Feb 23 14:19:05.032770 master-0 kubenswrapper[7728]: I0223 14:19:05.032702 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Feb 23 14:19:05.037041 master-0 kubenswrapper[7728]: I0223 14:19:05.036686 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Feb 23 14:19:05.040576 master-0 kubenswrapper[7728]: I0223 14:19:05.038553 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Feb 23 14:19:05.052179 master-0 kubenswrapper[7728]: I0223 14:19:05.052128 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cd7w\" (UniqueName: \"kubernetes.io/projected/ea0b3538-9a7d-4995-b628-2d63f21d683c-kube-api-access-2cd7w\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: 
\"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:19:05.054421 master-0 kubenswrapper[7728]: I0223 14:19:05.054351 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"] Feb 23 14:19:05.095418 master-0 kubenswrapper[7728]: I0223 14:19:05.095374 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/66c72c71-f74a-43ab-bf0d-1f4c93623774-catalogserver-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.095418 master-0 kubenswrapper[7728]: I0223 14:19:05.095426 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-docker\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.095677 master-0 kubenswrapper[7728]: I0223 14:19:05.095453 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66c72c71-f74a-43ab-bf0d-1f4c93623774-cache\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.095677 master-0 kubenswrapper[7728]: I0223 14:19:05.095639 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-containers\") pod 
\"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.095776 master-0 kubenswrapper[7728]: I0223 14:19:05.095739 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqzc\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-kube-api-access-xlqzc\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.095846 master-0 kubenswrapper[7728]: I0223 14:19:05.095785 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-containers\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.095891 master-0 kubenswrapper[7728]: I0223 14:19:05.095850 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-ca-certs\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.095891 master-0 kubenswrapper[7728]: I0223 14:19:05.095876 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-docker\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: 
\"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.095959 master-0 kubenswrapper[7728]: I0223 14:19:05.095903 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-cache\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.095959 master-0 kubenswrapper[7728]: I0223 14:19:05.095932 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlz28\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-kube-api-access-jlz28\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.096040 master-0 kubenswrapper[7728]: I0223 14:19:05.095978 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-ca-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.121816 master-0 kubenswrapper[7728]: I0223 14:19:05.121767 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:19:05.196591 master-0 kubenswrapper[7728]: I0223 14:19:05.196533 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66c72c71-f74a-43ab-bf0d-1f4c93623774-cache\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.196591 master-0 kubenswrapper[7728]: I0223 14:19:05.196587 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-containers\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.196835 master-0 kubenswrapper[7728]: I0223 14:19:05.196622 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqzc\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-kube-api-access-xlqzc\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.196835 master-0 kubenswrapper[7728]: I0223 14:19:05.196738 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-containers\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.196913 master-0 kubenswrapper[7728]: I0223 14:19:05.196877 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-containers\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.196981 master-0 kubenswrapper[7728]: I0223 14:19:05.196948 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-containers\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.196981 master-0 kubenswrapper[7728]: I0223 14:19:05.196956 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-ca-certs\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.197046 master-0 kubenswrapper[7728]: I0223 14:19:05.196994 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-docker\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.197046 master-0 kubenswrapper[7728]: I0223 14:19:05.197026 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-cache\") pod 
\"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.197108 master-0 kubenswrapper[7728]: I0223 14:19:05.197069 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlz28\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-kube-api-access-jlz28\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.197108 master-0 kubenswrapper[7728]: I0223 14:19:05.197091 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-ca-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.197169 master-0 kubenswrapper[7728]: I0223 14:19:05.197123 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/66c72c71-f74a-43ab-bf0d-1f4c93623774-catalogserver-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.197267 master-0 kubenswrapper[7728]: I0223 14:19:05.197234 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66c72c71-f74a-43ab-bf0d-1f4c93623774-cache\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 
14:19:05.197650 master-0 kubenswrapper[7728]: I0223 14:19:05.197616 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-docker\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.197704 master-0 kubenswrapper[7728]: I0223 14:19:05.197649 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-docker\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.197753 master-0 kubenswrapper[7728]: I0223 14:19:05.197726 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-docker\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.197834 master-0 kubenswrapper[7728]: I0223 14:19:05.197674 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-cache\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.200490 master-0 kubenswrapper[7728]: I0223 14:19:05.200456 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-ca-certs\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.201418 master-0 kubenswrapper[7728]: I0223 14:19:05.201379 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/66c72c71-f74a-43ab-bf0d-1f4c93623774-catalogserver-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.202063 master-0 kubenswrapper[7728]: I0223 14:19:05.202031 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-ca-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.213992 master-0 kubenswrapper[7728]: I0223 14:19:05.213954 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlz28\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-kube-api-access-jlz28\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.214375 master-0 kubenswrapper[7728]: I0223 14:19:05.214339 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqzc\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-kube-api-access-xlqzc\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " 
pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.368435 master-0 kubenswrapper[7728]: I0223 14:19:05.368314 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:05.382326 master-0 kubenswrapper[7728]: I0223 14:19:05.382299 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:05.472249 master-0 kubenswrapper[7728]: W0223 14:19:05.472193 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842d45c5_3452_4e97_b5f5_540395330a65.slice/crio-fa13ad269ac35b8dfe7ce217763b060906b7b9f87b5b13a745caacb634cb9d80 WatchSource:0}: Error finding container fa13ad269ac35b8dfe7ce217763b060906b7b9f87b5b13a745caacb634cb9d80: Status 404 returned error can't find the container with id fa13ad269ac35b8dfe7ce217763b060906b7b9f87b5b13a745caacb634cb9d80 Feb 23 14:19:05.531598 master-0 kubenswrapper[7728]: I0223 14:19:05.531070 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:19:05.561313 master-0 kubenswrapper[7728]: I0223 14:19:05.560385 7728 generic.go:334] "Generic (PLEG): container finished" podID="3cea0ab8-258b-486c-bb7f-8c93930b296d" containerID="e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6" exitCode=0 Feb 23 14:19:05.561313 master-0 kubenswrapper[7728]: I0223 14:19:05.560455 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" event={"ID":"3cea0ab8-258b-486c-bb7f-8c93930b296d","Type":"ContainerDied","Data":"e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6"} Feb 23 14:19:05.561313 master-0 kubenswrapper[7728]: I0223 14:19:05.560504 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" event={"ID":"3cea0ab8-258b-486c-bb7f-8c93930b296d","Type":"ContainerDied","Data":"d94e3aa12ea197f83dbe73a39e5dd5f5709d36ebed817bef04e39396f416a043"} Feb 23 14:19:05.561313 master-0 kubenswrapper[7728]: I0223 14:19:05.560546 7728 scope.go:117] "RemoveContainer" containerID="e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6" Feb 23 14:19:05.561313 master-0 kubenswrapper[7728]: I0223 14:19:05.560649 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg" Feb 23 14:19:05.564115 master-0 kubenswrapper[7728]: I0223 14:19:05.564070 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" event={"ID":"842d45c5-3452-4e97-b5f5-540395330a65","Type":"ContainerStarted","Data":"fa13ad269ac35b8dfe7ce217763b060906b7b9f87b5b13a745caacb634cb9d80"} Feb 23 14:19:05.617817 master-0 kubenswrapper[7728]: I0223 14:19:05.617696 7728 scope.go:117] "RemoveContainer" containerID="e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6" Feb 23 14:19:05.618407 master-0 kubenswrapper[7728]: E0223 14:19:05.618372 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6\": container with ID starting with e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6 not found: ID does not exist" containerID="e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6" Feb 23 14:19:05.618597 master-0 kubenswrapper[7728]: I0223 14:19:05.618414 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6"} err="failed to get container status \"e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6\": rpc error: code = NotFound desc = could not find container \"e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6\": container with ID starting with e532ebe35ecd05d1d110750408a3aca8e2e0ca55c7ed17dc1b108800da8ba8b6 not found: ID does not exist" Feb 23 14:19:05.705445 master-0 kubenswrapper[7728]: I0223 14:19:05.705147 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") pod 
\"3cea0ab8-258b-486c-bb7f-8c93930b296d\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " Feb 23 14:19:05.705445 master-0 kubenswrapper[7728]: I0223 14:19:05.705218 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") pod \"3cea0ab8-258b-486c-bb7f-8c93930b296d\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " Feb 23 14:19:05.705445 master-0 kubenswrapper[7728]: I0223 14:19:05.705338 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca\") pod \"3cea0ab8-258b-486c-bb7f-8c93930b296d\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " Feb 23 14:19:05.705445 master-0 kubenswrapper[7728]: I0223 14:19:05.705416 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access\") pod \"3cea0ab8-258b-486c-bb7f-8c93930b296d\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " Feb 23 14:19:05.716857 master-0 kubenswrapper[7728]: I0223 14:19:05.705497 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") pod \"3cea0ab8-258b-486c-bb7f-8c93930b296d\" (UID: \"3cea0ab8-258b-486c-bb7f-8c93930b296d\") " Feb 23 14:19:05.716857 master-0 kubenswrapper[7728]: I0223 14:19:05.705845 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "3cea0ab8-258b-486c-bb7f-8c93930b296d" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d"). InnerVolumeSpecName "etc-ssl-certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:05.716857 master-0 kubenswrapper[7728]: I0223 14:19:05.705876 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "3cea0ab8-258b-486c-bb7f-8c93930b296d" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:05.716857 master-0 kubenswrapper[7728]: I0223 14:19:05.706552 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca" (OuterVolumeSpecName: "service-ca") pod "3cea0ab8-258b-486c-bb7f-8c93930b296d" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:05.716857 master-0 kubenswrapper[7728]: I0223 14:19:05.716452 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3cea0ab8-258b-486c-bb7f-8c93930b296d" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:05.728505 master-0 kubenswrapper[7728]: I0223 14:19:05.728431 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3cea0ab8-258b-486c-bb7f-8c93930b296d" (UID: "3cea0ab8-258b-486c-bb7f-8c93930b296d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:05.811353 master-0 kubenswrapper[7728]: I0223 14:19:05.807039 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cea0ab8-258b-486c-bb7f-8c93930b296d-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:05.811353 master-0 kubenswrapper[7728]: I0223 14:19:05.807099 7728 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:05.811353 master-0 kubenswrapper[7728]: I0223 14:19:05.807110 7728 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3cea0ab8-258b-486c-bb7f-8c93930b296d-service-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:05.811353 master-0 kubenswrapper[7728]: I0223 14:19:05.807121 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3cea0ab8-258b-486c-bb7f-8c93930b296d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:05.811353 master-0 kubenswrapper[7728]: I0223 14:19:05.807131 7728 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3cea0ab8-258b-486c-bb7f-8c93930b296d-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:05.863706 master-0 kubenswrapper[7728]: I0223 14:19:05.859769 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"] Feb 23 14:19:05.943109 master-0 kubenswrapper[7728]: I0223 14:19:05.942901 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"] Feb 23 14:19:05.951846 master-0 kubenswrapper[7728]: I0223 14:19:05.951795 7728 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"] Feb 23 14:19:05.954806 master-0 kubenswrapper[7728]: I0223 14:19:05.954770 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"] Feb 23 14:19:05.997352 master-0 kubenswrapper[7728]: I0223 14:19:05.994301 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"] Feb 23 14:19:05.997652 master-0 kubenswrapper[7728]: W0223 14:19:05.997452 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3488a7eb_5170_478c_9af7_490dbe0f514e.slice/crio-867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc WatchSource:0}: Error finding container 867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc: Status 404 returned error can't find the container with id 867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc Feb 23 14:19:06.015779 master-0 kubenswrapper[7728]: I0223 14:19:06.000751 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-5cfd9759cf-bsqrg"] Feb 23 14:19:06.046058 master-0 kubenswrapper[7728]: I0223 14:19:06.045707 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-57476485-m58rm"] Feb 23 14:19:06.046058 master-0 kubenswrapper[7728]: E0223 14:19:06.045904 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cea0ab8-258b-486c-bb7f-8c93930b296d" containerName="cluster-version-operator" Feb 23 14:19:06.046058 master-0 kubenswrapper[7728]: I0223 14:19:06.045918 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cea0ab8-258b-486c-bb7f-8c93930b296d" containerName="cluster-version-operator" Feb 23 14:19:06.046058 master-0 kubenswrapper[7728]: I0223 14:19:06.046002 7728 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3cea0ab8-258b-486c-bb7f-8c93930b296d" containerName="cluster-version-operator" Feb 23 14:19:06.046380 master-0 kubenswrapper[7728]: I0223 14:19:06.046353 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.048434 master-0 kubenswrapper[7728]: I0223 14:19:06.048383 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 14:19:06.049526 master-0 kubenswrapper[7728]: I0223 14:19:06.049503 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 14:19:06.050166 master-0 kubenswrapper[7728]: I0223 14:19:06.050053 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 14:19:06.135653 master-0 kubenswrapper[7728]: I0223 14:19:06.135611 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"] Feb 23 14:19:06.148560 master-0 kubenswrapper[7728]: I0223 14:19:06.148311 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9dnsv"] Feb 23 14:19:06.158961 master-0 kubenswrapper[7728]: I0223 14:19:06.158438 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"] Feb 23 14:19:06.159280 master-0 kubenswrapper[7728]: I0223 14:19:06.159243 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"] Feb 23 14:19:06.228542 master-0 kubenswrapper[7728]: I0223 14:19:06.228412 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9774f8c-0f29-46d8-be77-81bcf74d5994-serving-cert\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.228542 master-0 kubenswrapper[7728]: I0223 14:19:06.228532 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-cvo-updatepayloads\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.229663 master-0 kubenswrapper[7728]: I0223 14:19:06.228898 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9774f8c-0f29-46d8-be77-81bcf74d5994-kube-api-access\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.229663 master-0 kubenswrapper[7728]: I0223 14:19:06.228965 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9774f8c-0f29-46d8-be77-81bcf74d5994-service-ca\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.229663 master-0 kubenswrapper[7728]: I0223 14:19:06.229078 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-ssl-certs\") pod \"cluster-version-operator-57476485-m58rm\" (UID: 
\"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.330626 master-0 kubenswrapper[7728]: I0223 14:19:06.330561 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9774f8c-0f29-46d8-be77-81bcf74d5994-serving-cert\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.330626 master-0 kubenswrapper[7728]: I0223 14:19:06.330623 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-cvo-updatepayloads\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.330854 master-0 kubenswrapper[7728]: I0223 14:19:06.330654 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9774f8c-0f29-46d8-be77-81bcf74d5994-kube-api-access\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.330854 master-0 kubenswrapper[7728]: I0223 14:19:06.330795 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9774f8c-0f29-46d8-be77-81bcf74d5994-service-ca\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.330854 master-0 kubenswrapper[7728]: I0223 14:19:06.330822 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-ssl-certs\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.330936 master-0 kubenswrapper[7728]: I0223 14:19:06.330902 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-ssl-certs\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.331264 master-0 kubenswrapper[7728]: I0223 14:19:06.331231 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-cvo-updatepayloads\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.332361 master-0 kubenswrapper[7728]: I0223 14:19:06.332321 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9774f8c-0f29-46d8-be77-81bcf74d5994-service-ca\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.353625 master-0 kubenswrapper[7728]: I0223 14:19:06.353574 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9774f8c-0f29-46d8-be77-81bcf74d5994-kube-api-access\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " 
pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.354175 master-0 kubenswrapper[7728]: I0223 14:19:06.354138 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9774f8c-0f29-46d8-be77-81bcf74d5994-serving-cert\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.438452 master-0 kubenswrapper[7728]: I0223 14:19:06.438272 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:19:06.571210 master-0 kubenswrapper[7728]: I0223 14:19:06.571074 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" event={"ID":"722d4323-f61f-43f0-958a-38b117880306","Type":"ContainerStarted","Data":"2d4c2e03b8c825b10187a251700d61e5829b4208fbb4aa1fd9687aeb12102689"} Feb 23 14:19:06.572269 master-0 kubenswrapper[7728]: I0223 14:19:06.572223 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" Feb 23 14:19:06.575393 master-0 kubenswrapper[7728]: I0223 14:19:06.575345 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" event={"ID":"65845117-85ab-4133-99ed-6dcd6e736e09","Type":"ContainerStarted","Data":"35b34cdb83e58e0911707e2ac6fe213d3d4beb01c126e819a9b508fddfe737d3"} Feb 23 14:19:06.575668 master-0 kubenswrapper[7728]: I0223 14:19:06.575639 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:19:06.576420 master-0 kubenswrapper[7728]: I0223 14:19:06.576368 7728 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc"} Feb 23 14:19:06.576904 master-0 kubenswrapper[7728]: I0223 14:19:06.576872 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" Feb 23 14:19:06.577744 master-0 kubenswrapper[7728]: I0223 14:19:06.577712 7728 generic.go:334] "Generic (PLEG): container finished" podID="7319dd65-3b07-4120-9a7b-60da5d0ed066" containerID="74b74e3a5975902b83372ba9185236b6fbcd47d3731e1d93753f1381faba3197" exitCode=0 Feb 23 14:19:06.577798 master-0 kubenswrapper[7728]: I0223 14:19:06.577779 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-746d649f6b-54g2n" event={"ID":"7319dd65-3b07-4120-9a7b-60da5d0ed066","Type":"ContainerDied","Data":"74b74e3a5975902b83372ba9185236b6fbcd47d3731e1d93753f1381faba3197"} Feb 23 14:19:06.585945 master-0 kubenswrapper[7728]: I0223 14:19:06.585766 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:19:06.595672 master-0 kubenswrapper[7728]: I0223 14:19:06.593842 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" podStartSLOduration=6.05153738 podStartE2EDuration="12.593820094s" podCreationTimestamp="2026-02-23 14:18:54 +0000 UTC" firstStartedPulling="2026-02-23 14:18:59.075331877 +0000 UTC m=+32.037993173" lastFinishedPulling="2026-02-23 14:19:05.617614591 +0000 UTC m=+38.580275887" observedRunningTime="2026-02-23 14:19:06.592214799 +0000 UTC m=+39.554876095" watchObservedRunningTime="2026-02-23 14:19:06.593820094 +0000 UTC m=+39.556481390" Feb 23 14:19:06.616582 master-0 
kubenswrapper[7728]: I0223 14:19:06.616494 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" podStartSLOduration=4.787649089 podStartE2EDuration="12.616453312s" podCreationTimestamp="2026-02-23 14:18:54 +0000 UTC" firstStartedPulling="2026-02-23 14:18:57.67332184 +0000 UTC m=+30.635983136" lastFinishedPulling="2026-02-23 14:19:05.502126063 +0000 UTC m=+38.464787359" observedRunningTime="2026-02-23 14:19:06.616406171 +0000 UTC m=+39.579067467" watchObservedRunningTime="2026-02-23 14:19:06.616453312 +0000 UTC m=+39.579114608" Feb 23 14:19:06.717963 master-0 kubenswrapper[7728]: W0223 14:19:06.717853 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c72c71_f74a_43ab_bf0d_1f4c93623774.slice/crio-e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54 WatchSource:0}: Error finding container e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54: Status 404 returned error can't find the container with id e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54 Feb 23 14:19:06.719597 master-0 kubenswrapper[7728]: W0223 14:19:06.719566 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea0b3538_9a7d_4995_b628_2d63f21d683c.slice/crio-fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16 WatchSource:0}: Error finding container fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16: Status 404 returned error can't find the container with id fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16 Feb 23 14:19:06.770289 master-0 kubenswrapper[7728]: I0223 14:19:06.770243 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Feb 23 14:19:06.771144 master-0 kubenswrapper[7728]: I0223 14:19:06.771120 7728 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:06.775238 master-0 kubenswrapper[7728]: I0223 14:19:06.775200 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Feb 23 14:19:06.839538 master-0 kubenswrapper[7728]: I0223 14:19:06.839433 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:19:06.935978 master-0 kubenswrapper[7728]: I0223 14:19:06.935932 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-config\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.936118 master-0 kubenswrapper[7728]: I0223 14:19:06.936004 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-serving-cert\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.936118 master-0 kubenswrapper[7728]: I0223 14:19:06.936029 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-encryption-config\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.936118 master-0 kubenswrapper[7728]: I0223 14:19:06.936054 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit-dir\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.936118 master-0 kubenswrapper[7728]: I0223 
14:19:06.936069 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-image-import-ca\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.936118 master-0 kubenswrapper[7728]: I0223 14:19:06.936107 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.936265 master-0 kubenswrapper[7728]: I0223 14:19:06.936124 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-trusted-ca-bundle\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.936265 master-0 kubenswrapper[7728]: I0223 14:19:06.936167 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4wpr\" (UniqueName: \"kubernetes.io/projected/7319dd65-3b07-4120-9a7b-60da5d0ed066-kube-api-access-s4wpr\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936374 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-config" (OuterVolumeSpecName: "config") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936439 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936563 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-node-pullsecrets\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936635 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-client\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936636 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936667 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-serving-ca\") pod \"7319dd65-3b07-4120-9a7b-60da5d0ed066\" (UID: \"7319dd65-3b07-4120-9a7b-60da5d0ed066\") " Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936885 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-var-lock\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936915 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936938 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.936992 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 
14:19:06.937003 7728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.937014 7728 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7319dd65-3b07-4120-9a7b-60da5d0ed066-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.937325 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.937601 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.937639 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit" (OuterVolumeSpecName: "audit") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:06.938153 master-0 kubenswrapper[7728]: I0223 14:19:06.937672 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:06.939185 master-0 kubenswrapper[7728]: I0223 14:19:06.939145 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:06.940005 master-0 kubenswrapper[7728]: I0223 14:19:06.939975 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:06.940435 master-0 kubenswrapper[7728]: I0223 14:19:06.940412 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7319dd65-3b07-4120-9a7b-60da5d0ed066-kube-api-access-s4wpr" (OuterVolumeSpecName: "kube-api-access-s4wpr") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "kube-api-access-s4wpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:06.952588 master-0 kubenswrapper[7728]: I0223 14:19:06.952523 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "7319dd65-3b07-4120-9a7b-60da5d0ed066" (UID: "7319dd65-3b07-4120-9a7b-60da5d0ed066"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:07.038508 master-0 kubenswrapper[7728]: I0223 14:19:07.038459 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-var-lock\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:07.038630 master-0 kubenswrapper[7728]: I0223 14:19:07.038522 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:07.038630 master-0 kubenswrapper[7728]: I0223 14:19:07.038546 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:07.038630 master-0 kubenswrapper[7728]: I0223 14:19:07.038620 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.038992 master-0 
kubenswrapper[7728]: I0223 14:19:07.038543 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-var-lock\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:07.038992 master-0 kubenswrapper[7728]: I0223 14:19:07.038634 7728 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-encryption-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.038992 master-0 kubenswrapper[7728]: I0223 14:19:07.038693 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:07.038992 master-0 kubenswrapper[7728]: I0223 14:19:07.038921 7728 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-image-import-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.038992 master-0 kubenswrapper[7728]: I0223 14:19:07.038938 7728 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-audit\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.039348 master-0 kubenswrapper[7728]: I0223 14:19:07.039179 7728 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.039348 master-0 kubenswrapper[7728]: I0223 14:19:07.039205 7728 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-s4wpr\" (UniqueName: \"kubernetes.io/projected/7319dd65-3b07-4120-9a7b-60da5d0ed066-kube-api-access-s4wpr\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.039348 master-0 kubenswrapper[7728]: I0223 14:19:07.039219 7728 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-client\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.039348 master-0 kubenswrapper[7728]: I0223 14:19:07.039232 7728 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7319dd65-3b07-4120-9a7b-60da5d0ed066-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:07.055753 master-0 kubenswrapper[7728]: I0223 14:19:07.055714 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kube-api-access\") pod \"installer-2-master-0\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:07.095777 master-0 kubenswrapper[7728]: I0223 14:19:07.095742 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:07.259346 master-0 kubenswrapper[7728]: I0223 14:19:07.259296 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cea0ab8-258b-486c-bb7f-8c93930b296d" path="/var/lib/kubelet/pods/3cea0ab8-258b-486c-bb7f-8c93930b296d/volumes" Feb 23 14:19:07.533185 master-0 kubenswrapper[7728]: I0223 14:19:07.532971 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Feb 23 14:19:07.538934 master-0 kubenswrapper[7728]: W0223 14:19:07.538855 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podab99fbb6_d945_4bf7_a85c_239f83137a4d.slice/crio-244a5c8ecd9fd0725cd2b5c5bda17cdcac940497bdf23d6c2d83d2e9ef3475bf WatchSource:0}: Error finding container 244a5c8ecd9fd0725cd2b5c5bda17cdcac940497bdf23d6c2d83d2e9ef3475bf: Status 404 returned error can't find the container with id 244a5c8ecd9fd0725cd2b5c5bda17cdcac940497bdf23d6c2d83d2e9ef3475bf Feb 23 14:19:07.587953 master-0 kubenswrapper[7728]: I0223 14:19:07.587893 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" event={"ID":"ea0b3538-9a7d-4995-b628-2d63f21d683c","Type":"ContainerStarted","Data":"fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16"} Feb 23 14:19:07.589664 master-0 kubenswrapper[7728]: I0223 14:19:07.589631 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86l7f" event={"ID":"6a801da1-a7eb-4187-98b8-315076f55e19","Type":"ContainerStarted","Data":"e9346a8d9dc410d0da0fd8dd3433eca16fb555bf7b080b48fc268fe5ccceefb8"} Feb 23 14:19:07.589664 master-0 kubenswrapper[7728]: I0223 14:19:07.589662 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86l7f" 
event={"ID":"6a801da1-a7eb-4187-98b8-315076f55e19","Type":"ContainerStarted","Data":"3a1f76c10c7d29eada49f24161c1b7b7382f293e5fcfef65b090da9015564f91"} Feb 23 14:19:07.590460 master-0 kubenswrapper[7728]: I0223 14:19:07.590392 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-86l7f" Feb 23 14:19:07.593766 master-0 kubenswrapper[7728]: I0223 14:19:07.593731 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" event={"ID":"b9774f8c-0f29-46d8-be77-81bcf74d5994","Type":"ContainerStarted","Data":"94bfdbcfdcf4914977da334b3fd2fe80966ec6c36be33d3628e4eada6361765f"} Feb 23 14:19:07.593766 master-0 kubenswrapper[7728]: I0223 14:19:07.593770 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" event={"ID":"b9774f8c-0f29-46d8-be77-81bcf74d5994","Type":"ContainerStarted","Data":"a74df791e2285ece031ddb2cb6a548b32c5f641cf114501941f1933c7809fad4"} Feb 23 14:19:07.595610 master-0 kubenswrapper[7728]: I0223 14:19:07.595531 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerStarted","Data":"d6900766f82c9e206cd3e7039baaa2878e6511f4627d07a00a47ed50496678b9"} Feb 23 14:19:07.595709 master-0 kubenswrapper[7728]: I0223 14:19:07.595624 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerStarted","Data":"305f42f52b6ba5ef239c92f6ac8cee0e2721fbe74d1ef92a70428b3a6fabdd04"} Feb 23 14:19:07.598058 master-0 kubenswrapper[7728]: I0223 14:19:07.598021 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" 
event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerStarted","Data":"da1d5c483a29eae497076b88d3f36ceedbd222eda1407e5ddf3aa59ebba386d7"} Feb 23 14:19:07.598163 master-0 kubenswrapper[7728]: I0223 14:19:07.598105 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerStarted","Data":"0bda8d15a11221e7b98f49af56e0807945868c4a5e5d028da4a5c53d7f410c01"} Feb 23 14:19:07.598163 master-0 kubenswrapper[7728]: I0223 14:19:07.598121 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerStarted","Data":"fee0bde3d0eee2f0bc5f9cbe5f3f907b178692716f3f6aef77b4bea08c864506"} Feb 23 14:19:07.599313 master-0 kubenswrapper[7728]: I0223 14:19:07.599250 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:07.601689 master-0 kubenswrapper[7728]: I0223 14:19:07.601627 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-746d649f6b-54g2n" event={"ID":"7319dd65-3b07-4120-9a7b-60da5d0ed066","Type":"ContainerDied","Data":"8e5569d1d3d19f6d47cce98e450c91513b5ae752318bb0728d77dfab67e4d723"} Feb 23 14:19:07.601689 master-0 kubenswrapper[7728]: I0223 14:19:07.601674 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-746d649f6b-54g2n" Feb 23 14:19:07.601856 master-0 kubenswrapper[7728]: I0223 14:19:07.601690 7728 scope.go:117] "RemoveContainer" containerID="74b74e3a5975902b83372ba9185236b6fbcd47d3731e1d93753f1381faba3197" Feb 23 14:19:07.603732 master-0 kubenswrapper[7728]: I0223 14:19:07.603662 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9dnsv" event={"ID":"ace75aae-6f4f-4299-90e2-d5292271b136","Type":"ContainerStarted","Data":"edfafba30f67b299b61cf6429b5bf8b47c050040f18802ede0a7f2834a957ae9"} Feb 23 14:19:07.608678 master-0 kubenswrapper[7728]: I0223 14:19:07.606293 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"ab99fbb6-d945-4bf7-a85c-239f83137a4d","Type":"ContainerStarted","Data":"244a5c8ecd9fd0725cd2b5c5bda17cdcac940497bdf23d6c2d83d2e9ef3475bf"} Feb 23 14:19:07.608678 master-0 kubenswrapper[7728]: I0223 14:19:07.607607 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-86l7f" podStartSLOduration=11.139382108 podStartE2EDuration="17.607561275s" podCreationTimestamp="2026-02-23 14:18:50 +0000 UTC" firstStartedPulling="2026-02-23 14:19:00.29334257 +0000 UTC m=+33.256003866" lastFinishedPulling="2026-02-23 14:19:06.761521737 +0000 UTC m=+39.724183033" observedRunningTime="2026-02-23 14:19:07.607120395 +0000 UTC m=+40.569781691" watchObservedRunningTime="2026-02-23 14:19:07.607561275 +0000 UTC m=+40.570222571" Feb 23 14:19:07.608678 master-0 kubenswrapper[7728]: I0223 14:19:07.608449 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" event={"ID":"585f74db-4593-426b-b0c7-ec8f64810549","Type":"ContainerStarted","Data":"601970e99fee05d1ddde3baeb681b21e539729838cc176b833fde61b155a74a5"} Feb 23 14:19:07.614474 master-0 kubenswrapper[7728]: I0223 14:19:07.614440 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" event={"ID":"646fece3-4a42-4e0c-bcc7-5f705f948d63","Type":"ContainerStarted","Data":"d4eec9eade1a6fd9bfe0d642fe3ae425b01a962b7129ee11f0681674274aaff6"} Feb 23 14:19:07.617206 master-0 kubenswrapper[7728]: I0223 14:19:07.617101 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerStarted","Data":"e192093c7698f9c13f14fd55a50b3b960cd4142b3b8cb914299c2709465ffc51"} Feb 23 14:19:07.617206 master-0 kubenswrapper[7728]: I0223 14:19:07.617136 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerStarted","Data":"4d87515ad6869d3e6b52fa61f2c70d062fc9d561e8b978fe5f1550fbaf07704a"} Feb 23 14:19:07.617206 master-0 kubenswrapper[7728]: I0223 14:19:07.617153 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerStarted","Data":"e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54"} Feb 23 14:19:07.647590 master-0 kubenswrapper[7728]: I0223 14:19:07.647536 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" podStartSLOduration=1.647519436 podStartE2EDuration="1.647519436s" podCreationTimestamp="2026-02-23 14:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:07.645458481 +0000 UTC m=+40.608119777" watchObservedRunningTime="2026-02-23 14:19:07.647519436 +0000 UTC m=+40.610180732" Feb 23 14:19:07.649470 master-0 kubenswrapper[7728]: I0223 
14:19:07.649433 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" podStartSLOduration=3.649419876 podStartE2EDuration="3.649419876s" podCreationTimestamp="2026-02-23 14:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:07.622487846 +0000 UTC m=+40.585149152" watchObservedRunningTime="2026-02-23 14:19:07.649419876 +0000 UTC m=+40.612081182" Feb 23 14:19:07.683491 master-0 kubenswrapper[7728]: I0223 14:19:07.683429 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-666b887977-f7h55"] Feb 23 14:19:07.683681 master-0 kubenswrapper[7728]: E0223 14:19:07.683610 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7319dd65-3b07-4120-9a7b-60da5d0ed066" containerName="fix-audit-permissions" Feb 23 14:19:07.683681 master-0 kubenswrapper[7728]: I0223 14:19:07.683621 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="7319dd65-3b07-4120-9a7b-60da5d0ed066" containerName="fix-audit-permissions" Feb 23 14:19:07.683756 master-0 kubenswrapper[7728]: I0223 14:19:07.683708 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="7319dd65-3b07-4120-9a7b-60da5d0ed066" containerName="fix-audit-permissions" Feb 23 14:19:07.685004 master-0 kubenswrapper[7728]: I0223 14:19:07.684977 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.690571 master-0 kubenswrapper[7728]: I0223 14:19:07.689345 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 14:19:07.690571 master-0 kubenswrapper[7728]: I0223 14:19:07.689520 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 14:19:07.690571 master-0 kubenswrapper[7728]: I0223 14:19:07.689609 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 23 14:19:07.690571 master-0 kubenswrapper[7728]: I0223 14:19:07.689814 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 23 14:19:07.690571 master-0 kubenswrapper[7728]: I0223 14:19:07.689923 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 23 14:19:07.690571 master-0 kubenswrapper[7728]: I0223 14:19:07.690025 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 14:19:07.691306 master-0 kubenswrapper[7728]: I0223 14:19:07.691285 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 23 14:19:07.691405 master-0 kubenswrapper[7728]: I0223 14:19:07.689372 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 14:19:07.691619 master-0 kubenswrapper[7728]: I0223 14:19:07.691590 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 23 14:19:07.699612 master-0 kubenswrapper[7728]: I0223 14:19:07.696737 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-746d649f6b-54g2n"] Feb 23 14:19:07.708683 master-0 kubenswrapper[7728]: I0223 14:19:07.707846 7728 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-746d649f6b-54g2n"] Feb 23 14:19:07.708683 master-0 kubenswrapper[7728]: I0223 14:19:07.707931 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-666b887977-f7h55"] Feb 23 14:19:07.708683 master-0 kubenswrapper[7728]: I0223 14:19:07.708006 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 14:19:07.715257 master-0 kubenswrapper[7728]: I0223 14:19:07.714088 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" podStartSLOduration=3.714065589 podStartE2EDuration="3.714065589s" podCreationTimestamp="2026-02-23 14:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:07.710243667 +0000 UTC m=+40.672904963" watchObservedRunningTime="2026-02-23 14:19:07.714065589 +0000 UTC m=+40.676726915" Feb 23 14:19:07.854830 master-0 kubenswrapper[7728]: I0223 14:19:07.854773 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-trusted-ca-bundle\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.854830 master-0 kubenswrapper[7728]: I0223 14:19:07.854839 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzrqz\" (UniqueName: \"kubernetes.io/projected/588a804a-430a-47f4-aa97-c08e907239da-kube-api-access-hzrqz\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 
kubenswrapper[7728]: I0223 14:19:07.854866 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-node-pullsecrets\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.854891 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.854916 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-encryption-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.855011 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-serving-cert\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.855036 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-audit-dir\") pod \"apiserver-666b887977-f7h55\" (UID: 
\"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.855059 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-etcd-serving-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.855117 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-image-import-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.855144 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-etcd-client\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.855333 master-0 kubenswrapper[7728]: I0223 14:19:07.855158 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-audit\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.919327 master-0 kubenswrapper[7728]: I0223 14:19:07.919288 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 
14:19:07.956675 master-0 kubenswrapper[7728]: I0223 14:19:07.956634 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-audit-dir\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956675 master-0 kubenswrapper[7728]: I0223 14:19:07.956675 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-etcd-serving-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956707 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-image-import-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956737 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-etcd-client\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956752 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-audit\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 
23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956774 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-trusted-ca-bundle\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956789 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzrqz\" (UniqueName: \"kubernetes.io/projected/588a804a-430a-47f4-aa97-c08e907239da-kube-api-access-hzrqz\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956805 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-node-pullsecrets\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956824 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956853 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-encryption-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " 
pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.956937 master-0 kubenswrapper[7728]: I0223 14:19:07.956881 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-serving-cert\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.957631 master-0 kubenswrapper[7728]: I0223 14:19:07.957586 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-node-pullsecrets\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.958114 master-0 kubenswrapper[7728]: I0223 14:19:07.958086 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-image-import-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.958156 master-0 kubenswrapper[7728]: I0223 14:19:07.958122 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.958209 master-0 kubenswrapper[7728]: I0223 14:19:07.958171 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-audit-dir\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " 
pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.958838 master-0 kubenswrapper[7728]: I0223 14:19:07.958798 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-audit\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.959397 master-0 kubenswrapper[7728]: I0223 14:19:07.959370 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-trusted-ca-bundle\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.959980 master-0 kubenswrapper[7728]: I0223 14:19:07.959942 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-etcd-serving-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.976720 master-0 kubenswrapper[7728]: I0223 14:19:07.976684 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-encryption-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.977274 master-0 kubenswrapper[7728]: I0223 14:19:07.977249 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-serving-cert\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " 
pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.977716 master-0 kubenswrapper[7728]: I0223 14:19:07.977695 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-etcd-client\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:07.984561 master-0 kubenswrapper[7728]: I0223 14:19:07.984190 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzrqz\" (UniqueName: \"kubernetes.io/projected/588a804a-430a-47f4-aa97-c08e907239da-kube-api-access-hzrqz\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:08.013468 master-0 kubenswrapper[7728]: I0223 14:19:08.013418 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:19:08.625052 master-0 kubenswrapper[7728]: I0223 14:19:08.624971 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"ab99fbb6-d945-4bf7-a85c-239f83137a4d","Type":"ContainerStarted","Data":"1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a"} Feb 23 14:19:08.625929 master-0 kubenswrapper[7728]: I0223 14:19:08.625891 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:08.646131 master-0 kubenswrapper[7728]: I0223 14:19:08.646021 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=2.6460004489999998 podStartE2EDuration="2.646000449s" podCreationTimestamp="2026-02-23 14:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:08.641598094 +0000 UTC m=+41.604259390" watchObservedRunningTime="2026-02-23 14:19:08.646000449 +0000 UTC m=+41.608661745" Feb 23 14:19:09.226293 master-0 kubenswrapper[7728]: I0223 14:19:09.226254 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7319dd65-3b07-4120-9a7b-60da5d0ed066" path="/var/lib/kubelet/pods/7319dd65-3b07-4120-9a7b-60da5d0ed066/volumes" Feb 23 14:19:10.826116 master-0 kubenswrapper[7728]: I0223 14:19:10.826062 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5896cbddf7-qqhhj"] Feb 23 14:19:10.827301 master-0 kubenswrapper[7728]: I0223 14:19:10.826413 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" podUID="65845117-85ab-4133-99ed-6dcd6e736e09" containerName="controller-manager" containerID="cri-o://35b34cdb83e58e0911707e2ac6fe213d3d4beb01c126e819a9b508fddfe737d3" gracePeriod=30 Feb 23 14:19:10.851236 master-0 kubenswrapper[7728]: I0223 14:19:10.850980 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"] Feb 23 14:19:10.851236 master-0 kubenswrapper[7728]: I0223 14:19:10.851212 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" podUID="722d4323-f61f-43f0-958a-38b117880306" containerName="route-controller-manager" containerID="cri-o://2d4c2e03b8c825b10187a251700d61e5829b4208fbb4aa1fd9687aeb12102689" gracePeriod=30 Feb 23 14:19:11.660102 master-0 kubenswrapper[7728]: I0223 14:19:11.659896 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" 
event={"ID":"65845117-85ab-4133-99ed-6dcd6e736e09","Type":"ContainerDied","Data":"35b34cdb83e58e0911707e2ac6fe213d3d4beb01c126e819a9b508fddfe737d3"} Feb 23 14:19:11.660351 master-0 kubenswrapper[7728]: I0223 14:19:11.659868 7728 generic.go:334] "Generic (PLEG): container finished" podID="65845117-85ab-4133-99ed-6dcd6e736e09" containerID="35b34cdb83e58e0911707e2ac6fe213d3d4beb01c126e819a9b508fddfe737d3" exitCode=0 Feb 23 14:19:11.669795 master-0 kubenswrapper[7728]: I0223 14:19:11.663684 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d"} Feb 23 14:19:11.669795 master-0 kubenswrapper[7728]: I0223 14:19:11.665853 7728 generic.go:334] "Generic (PLEG): container finished" podID="722d4323-f61f-43f0-958a-38b117880306" containerID="2d4c2e03b8c825b10187a251700d61e5829b4208fbb4aa1fd9687aeb12102689" exitCode=0 Feb 23 14:19:11.669795 master-0 kubenswrapper[7728]: I0223 14:19:11.665904 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" event={"ID":"722d4323-f61f-43f0-958a-38b117880306","Type":"ContainerDied","Data":"2d4c2e03b8c825b10187a251700d61e5829b4208fbb4aa1fd9687aeb12102689"} Feb 23 14:19:11.669795 master-0 kubenswrapper[7728]: I0223 14:19:11.667667 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" event={"ID":"585f74db-4593-426b-b0c7-ec8f64810549","Type":"ContainerStarted","Data":"3d191963e287b24eb8e359eae476b7710f1b01ed3998cce17300434d7f6e8d0b"} Feb 23 14:19:11.669795 master-0 kubenswrapper[7728]: I0223 14:19:11.668797 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:19:11.671970 master-0 
kubenswrapper[7728]: I0223 14:19:11.671933 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" event={"ID":"646fece3-4a42-4e0c-bcc7-5f705f948d63","Type":"ContainerStarted","Data":"616aaf781ead921aaf00c1f2c54e5afa774694ee9d0c517cb881a4d591a4f886"} Feb 23 14:19:11.674083 master-0 kubenswrapper[7728]: I0223 14:19:11.673743 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body= Feb 23 14:19:11.674083 master-0 kubenswrapper[7728]: I0223 14:19:11.673795 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:19:11.843404 master-0 kubenswrapper[7728]: I0223 14:19:11.843364 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" Feb 23 14:19:11.889595 master-0 kubenswrapper[7728]: I0223 14:19:11.889508 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-666b887977-f7h55"] Feb 23 14:19:11.933960 master-0 kubenswrapper[7728]: I0223 14:19:11.933921 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:19:12.007512 master-0 kubenswrapper[7728]: I0223 14:19:12.007378 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-proxy-ca-bundles\") pod \"65845117-85ab-4133-99ed-6dcd6e736e09\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " Feb 23 14:19:12.007512 master-0 kubenswrapper[7728]: I0223 14:19:12.007443 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-config\") pod \"65845117-85ab-4133-99ed-6dcd6e736e09\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " Feb 23 14:19:12.007512 master-0 kubenswrapper[7728]: I0223 14:19:12.007517 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert\") pod \"65845117-85ab-4133-99ed-6dcd6e736e09\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.007573 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d4323-f61f-43f0-958a-38b117880306-serving-cert\") pod \"722d4323-f61f-43f0-958a-38b117880306\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.007871 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-config\") pod \"722d4323-f61f-43f0-958a-38b117880306\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.007905 7728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-client-ca\") pod \"722d4323-f61f-43f0-958a-38b117880306\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.007945 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-client-ca\") pod \"65845117-85ab-4133-99ed-6dcd6e736e09\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.008262 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "65845117-85ab-4133-99ed-6dcd6e736e09" (UID: "65845117-85ab-4133-99ed-6dcd6e736e09"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.008324 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chc9w\" (UniqueName: \"kubernetes.io/projected/65845117-85ab-4133-99ed-6dcd6e736e09-kube-api-access-chc9w\") pod \"65845117-85ab-4133-99ed-6dcd6e736e09\" (UID: \"65845117-85ab-4133-99ed-6dcd6e736e09\") " Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.008367 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8cg6\" (UniqueName: \"kubernetes.io/projected/722d4323-f61f-43f0-958a-38b117880306-kube-api-access-f8cg6\") pod \"722d4323-f61f-43f0-958a-38b117880306\" (UID: \"722d4323-f61f-43f0-958a-38b117880306\") " Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.008603 7728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.009578 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-config" (OuterVolumeSpecName: "config") pod "65845117-85ab-4133-99ed-6dcd6e736e09" (UID: "65845117-85ab-4133-99ed-6dcd6e736e09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.010275 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-client-ca" (OuterVolumeSpecName: "client-ca") pod "722d4323-f61f-43f0-958a-38b117880306" (UID: "722d4323-f61f-43f0-958a-38b117880306"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.011090 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-config" (OuterVolumeSpecName: "config") pod "722d4323-f61f-43f0-958a-38b117880306" (UID: "722d4323-f61f-43f0-958a-38b117880306"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:12.013093 master-0 kubenswrapper[7728]: I0223 14:19:12.011919 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-client-ca" (OuterVolumeSpecName: "client-ca") pod "65845117-85ab-4133-99ed-6dcd6e736e09" (UID: "65845117-85ab-4133-99ed-6dcd6e736e09"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:12.025277 master-0 kubenswrapper[7728]: I0223 14:19:12.025185 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722d4323-f61f-43f0-958a-38b117880306-kube-api-access-f8cg6" (OuterVolumeSpecName: "kube-api-access-f8cg6") pod "722d4323-f61f-43f0-958a-38b117880306" (UID: "722d4323-f61f-43f0-958a-38b117880306"). InnerVolumeSpecName "kube-api-access-f8cg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:12.025432 master-0 kubenswrapper[7728]: I0223 14:19:12.025341 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "65845117-85ab-4133-99ed-6dcd6e736e09" (UID: "65845117-85ab-4133-99ed-6dcd6e736e09"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:12.029415 master-0 kubenswrapper[7728]: I0223 14:19:12.028670 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722d4323-f61f-43f0-958a-38b117880306-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "722d4323-f61f-43f0-958a-38b117880306" (UID: "722d4323-f61f-43f0-958a-38b117880306"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:12.035537 master-0 kubenswrapper[7728]: I0223 14:19:12.033785 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65845117-85ab-4133-99ed-6dcd6e736e09-kube-api-access-chc9w" (OuterVolumeSpecName: "kube-api-access-chc9w") pod "65845117-85ab-4133-99ed-6dcd6e736e09" (UID: "65845117-85ab-4133-99ed-6dcd6e736e09"). InnerVolumeSpecName "kube-api-access-chc9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:12.113968 master-0 kubenswrapper[7728]: I0223 14:19:12.113913 7728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-client-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.113968 master-0 kubenswrapper[7728]: I0223 14:19:12.113958 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chc9w\" (UniqueName: \"kubernetes.io/projected/65845117-85ab-4133-99ed-6dcd6e736e09-kube-api-access-chc9w\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.113968 master-0 kubenswrapper[7728]: I0223 14:19:12.113968 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8cg6\" (UniqueName: \"kubernetes.io/projected/722d4323-f61f-43f0-958a-38b117880306-kube-api-access-f8cg6\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.113968 master-0 kubenswrapper[7728]: I0223 14:19:12.113976 7728 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/65845117-85ab-4133-99ed-6dcd6e736e09-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.113968 master-0 kubenswrapper[7728]: I0223 14:19:12.113984 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65845117-85ab-4133-99ed-6dcd6e736e09-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.114270 master-0 kubenswrapper[7728]: I0223 14:19:12.113994 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d4323-f61f-43f0-958a-38b117880306-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.114270 master-0 kubenswrapper[7728]: I0223 14:19:12.114002 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.114270 master-0 kubenswrapper[7728]: I0223 14:19:12.114010 7728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/722d4323-f61f-43f0-958a-38b117880306-client-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:12.368724 master-0 kubenswrapper[7728]: I0223 14:19:12.368099 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z"] Feb 23 14:19:12.368724 master-0 kubenswrapper[7728]: E0223 14:19:12.368264 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722d4323-f61f-43f0-958a-38b117880306" containerName="route-controller-manager" Feb 23 14:19:12.368724 master-0 kubenswrapper[7728]: I0223 14:19:12.368277 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="722d4323-f61f-43f0-958a-38b117880306" containerName="route-controller-manager" Feb 23 14:19:12.368724 master-0 kubenswrapper[7728]: E0223 14:19:12.368298 7728 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="65845117-85ab-4133-99ed-6dcd6e736e09" containerName="controller-manager" Feb 23 14:19:12.368724 master-0 kubenswrapper[7728]: I0223 14:19:12.368307 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="65845117-85ab-4133-99ed-6dcd6e736e09" containerName="controller-manager" Feb 23 14:19:12.368724 master-0 kubenswrapper[7728]: I0223 14:19:12.368393 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="65845117-85ab-4133-99ed-6dcd6e736e09" containerName="controller-manager" Feb 23 14:19:12.368724 master-0 kubenswrapper[7728]: I0223 14:19:12.368405 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="722d4323-f61f-43f0-958a-38b117880306" containerName="route-controller-manager" Feb 23 14:19:12.369600 master-0 kubenswrapper[7728]: I0223 14:19:12.369582 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.378055 master-0 kubenswrapper[7728]: I0223 14:19:12.378010 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z"] Feb 23 14:19:12.518225 master-0 kubenswrapper[7728]: I0223 14:19:12.518152 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-client-ca\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.518225 master-0 kubenswrapper[7728]: I0223 14:19:12.518202 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9694f604-5dcf-4aec-a54f-67b4d8dc8809-serving-cert\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: 
\"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.518225 master-0 kubenswrapper[7728]: I0223 14:19:12.518229 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5f7c\" (UniqueName: \"kubernetes.io/projected/9694f604-5dcf-4aec-a54f-67b4d8dc8809-kube-api-access-n5f7c\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.518619 master-0 kubenswrapper[7728]: I0223 14:19:12.518252 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-config\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.619614 master-0 kubenswrapper[7728]: I0223 14:19:12.619549 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9694f604-5dcf-4aec-a54f-67b4d8dc8809-serving-cert\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.619840 master-0 kubenswrapper[7728]: I0223 14:19:12.619744 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5f7c\" (UniqueName: \"kubernetes.io/projected/9694f604-5dcf-4aec-a54f-67b4d8dc8809-kube-api-access-n5f7c\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 
14:19:12.619840 master-0 kubenswrapper[7728]: I0223 14:19:12.619773 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-config\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.619840 master-0 kubenswrapper[7728]: I0223 14:19:12.619809 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-client-ca\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.621723 master-0 kubenswrapper[7728]: I0223 14:19:12.620665 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-client-ca\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.621723 master-0 kubenswrapper[7728]: I0223 14:19:12.621651 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-config\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.623228 master-0 kubenswrapper[7728]: I0223 14:19:12.623207 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9694f604-5dcf-4aec-a54f-67b4d8dc8809-serving-cert\") pod 
\"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.641104 master-0 kubenswrapper[7728]: I0223 14:19:12.641067 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5f7c\" (UniqueName: \"kubernetes.io/projected/9694f604-5dcf-4aec-a54f-67b4d8dc8809-kube-api-access-n5f7c\") pod \"route-controller-manager-6c445f5bd9-qtw5z\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.678522 master-0 kubenswrapper[7728]: I0223 14:19:12.678200 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" event={"ID":"65845117-85ab-4133-99ed-6dcd6e736e09","Type":"ContainerDied","Data":"e64ca766bdf85321a781ce69ed3e1869acf08ca216f304fdb5561750a836b6b8"} Feb 23 14:19:12.678522 master-0 kubenswrapper[7728]: I0223 14:19:12.678253 7728 scope.go:117] "RemoveContainer" containerID="35b34cdb83e58e0911707e2ac6fe213d3d4beb01c126e819a9b508fddfe737d3" Feb 23 14:19:12.678522 master-0 kubenswrapper[7728]: I0223 14:19:12.678259 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5896cbddf7-qqhhj" Feb 23 14:19:12.683991 master-0 kubenswrapper[7728]: I0223 14:19:12.683963 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"67af818672ec3e17ceefc5db7bd1c60de0a5faf480f82c76fab1be5ba6eb05bb"} Feb 23 14:19:12.691872 master-0 kubenswrapper[7728]: I0223 14:19:12.690262 7728 generic.go:334] "Generic (PLEG): container finished" podID="588a804a-430a-47f4-aa97-c08e907239da" containerID="dd9d0218254ede188f4f7d6704abe8b5fb6e6589605eb6657b67d6dc33a0eb39" exitCode=0 Feb 23 14:19:12.691872 master-0 kubenswrapper[7728]: I0223 14:19:12.690289 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerDied","Data":"dd9d0218254ede188f4f7d6704abe8b5fb6e6589605eb6657b67d6dc33a0eb39"} Feb 23 14:19:12.691872 master-0 kubenswrapper[7728]: I0223 14:19:12.690332 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerStarted","Data":"15c8631804d0c1c71f4b64c5ee4ec4990f2c2f6adda4a03d015df366f3b28fd1"} Feb 23 14:19:12.703147 master-0 kubenswrapper[7728]: I0223 14:19:12.702981 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9dnsv" event={"ID":"ace75aae-6f4f-4299-90e2-d5292271b136","Type":"ContainerStarted","Data":"b5bbb5699c2dab9ad551747a05cc6c2594a6cc174c7a5d51b7f4dba4a3a2f82a"} Feb 23 14:19:12.703147 master-0 kubenswrapper[7728]: I0223 14:19:12.703021 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9dnsv" 
event={"ID":"ace75aae-6f4f-4299-90e2-d5292271b136","Type":"ContainerStarted","Data":"3b7e44d2452ae6675e651554bab681a42fae84f0e05ad5a73fb3027444acc8b8"} Feb 23 14:19:12.713843 master-0 kubenswrapper[7728]: I0223 14:19:12.713789 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" event={"ID":"722d4323-f61f-43f0-958a-38b117880306","Type":"ContainerDied","Data":"5fc2b94e0100ca2b4a22051eea718d3cb70fbda931eb12d807c250eac49aeec2"} Feb 23 14:19:12.713843 master-0 kubenswrapper[7728]: I0223 14:19:12.713837 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc" Feb 23 14:19:12.714070 master-0 kubenswrapper[7728]: I0223 14:19:12.713856 7728 scope.go:117] "RemoveContainer" containerID="2d4c2e03b8c825b10187a251700d61e5829b4208fbb4aa1fd9687aeb12102689" Feb 23 14:19:12.724890 master-0 kubenswrapper[7728]: I0223 14:19:12.724850 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" event={"ID":"842d45c5-3452-4e97-b5f5-540395330a65","Type":"ContainerStarted","Data":"2c192f5e695207c7fb2b849827d14cbf1e828f8b6127dbf574b9e2669fd9c4a7"} Feb 23 14:19:12.724967 master-0 kubenswrapper[7728]: I0223 14:19:12.724896 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" event={"ID":"842d45c5-3452-4e97-b5f5-540395330a65","Type":"ContainerStarted","Data":"a1c596e71b919718ef2a5ecca5e4edc213fe0205cd3b4ffeba1110c64e033918"} Feb 23 14:19:12.726767 master-0 kubenswrapper[7728]: I0223 14:19:12.726724 7728 generic.go:334] "Generic (PLEG): container finished" podID="ea0b3538-9a7d-4995-b628-2d63f21d683c" containerID="ef55d8167c92b21a23135f3a8ced87d51d79df376d7aca850c7cba442f901e30" exitCode=0 Feb 23 14:19:12.726851 master-0 kubenswrapper[7728]: I0223 14:19:12.726821 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" event={"ID":"ea0b3538-9a7d-4995-b628-2d63f21d683c","Type":"ContainerDied","Data":"ef55d8167c92b21a23135f3a8ced87d51d79df376d7aca850c7cba442f901e30"} Feb 23 14:19:12.732901 master-0 kubenswrapper[7728]: I0223 14:19:12.732803 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:19:12.743798 master-0 kubenswrapper[7728]: I0223 14:19:12.743665 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5896cbddf7-qqhhj"] Feb 23 14:19:12.746182 master-0 kubenswrapper[7728]: I0223 14:19:12.746137 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5896cbddf7-qqhhj"] Feb 23 14:19:12.800139 master-0 kubenswrapper[7728]: I0223 14:19:12.794689 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:12.878499 master-0 kubenswrapper[7728]: I0223 14:19:12.871538 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"] Feb 23 14:19:12.878499 master-0 kubenswrapper[7728]: I0223 14:19:12.875969 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56fb65d69d-lnsbc"] Feb 23 14:19:13.226929 master-0 kubenswrapper[7728]: I0223 14:19:13.226874 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65845117-85ab-4133-99ed-6dcd6e736e09" path="/var/lib/kubelet/pods/65845117-85ab-4133-99ed-6dcd6e736e09/volumes" Feb 23 14:19:13.229544 master-0 kubenswrapper[7728]: I0223 14:19:13.227561 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722d4323-f61f-43f0-958a-38b117880306" path="/var/lib/kubelet/pods/722d4323-f61f-43f0-958a-38b117880306/volumes" Feb 23 14:19:13.457128 master-0 kubenswrapper[7728]: I0223 14:19:13.457067 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Feb 23 14:19:13.457544 master-0 kubenswrapper[7728]: I0223 14:19:13.457517 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.460600 master-0 kubenswrapper[7728]: I0223 14:19:13.460569 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Feb 23 14:19:13.488741 master-0 kubenswrapper[7728]: I0223 14:19:13.488630 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Feb 23 14:19:13.538078 master-0 kubenswrapper[7728]: I0223 14:19:13.537999 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-var-lock\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.538339 master-0 kubenswrapper[7728]: I0223 14:19:13.538143 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f67ab24-82bc-4e71-b974-e25b819986c8-kube-api-access\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.538339 master-0 kubenswrapper[7728]: I0223 14:19:13.538171 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.639711 master-0 kubenswrapper[7728]: I0223 14:19:13.639656 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f67ab24-82bc-4e71-b974-e25b819986c8-kube-api-access\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " 
pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.639711 master-0 kubenswrapper[7728]: I0223 14:19:13.639711 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.639978 master-0 kubenswrapper[7728]: I0223 14:19:13.639905 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-var-lock\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.639978 master-0 kubenswrapper[7728]: I0223 14:19:13.639945 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.640037 master-0 kubenswrapper[7728]: I0223 14:19:13.640014 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-var-lock\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.655496 master-0 kubenswrapper[7728]: I0223 14:19:13.655433 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f67ab24-82bc-4e71-b974-e25b819986c8-kube-api-access\") pod \"installer-1-master-0\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:13.771176 master-0 kubenswrapper[7728]: I0223 
14:19:13.770877 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Feb 23 14:19:14.370641 master-0 kubenswrapper[7728]: I0223 14:19:14.370588 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-789c749c66-w826g"] Feb 23 14:19:14.371140 master-0 kubenswrapper[7728]: I0223 14:19:14.371112 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.372306 master-0 kubenswrapper[7728]: I0223 14:19:14.372266 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 14:19:14.372834 master-0 kubenswrapper[7728]: I0223 14:19:14.372810 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 14:19:14.373220 master-0 kubenswrapper[7728]: I0223 14:19:14.373168 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 14:19:14.373456 master-0 kubenswrapper[7728]: I0223 14:19:14.373422 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 14:19:14.373536 master-0 kubenswrapper[7728]: I0223 14:19:14.373522 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 14:19:14.384718 master-0 kubenswrapper[7728]: I0223 14:19:14.384691 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-789c749c66-w826g"] Feb 23 14:19:14.384845 master-0 kubenswrapper[7728]: I0223 14:19:14.384807 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 14:19:14.449034 master-0 kubenswrapper[7728]: I0223 14:19:14.448966 7728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-client-ca\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.449034 master-0 kubenswrapper[7728]: I0223 14:19:14.449037 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2af015a-7f5f-4702-9f06-8159a040dabe-serving-cert\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.449306 master-0 kubenswrapper[7728]: I0223 14:19:14.449081 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-proxy-ca-bundles\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.449306 master-0 kubenswrapper[7728]: I0223 14:19:14.449107 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-config\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.449306 master-0 kubenswrapper[7728]: I0223 14:19:14.449144 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzm8\" (UniqueName: 
\"kubernetes.io/projected/b2af015a-7f5f-4702-9f06-8159a040dabe-kube-api-access-rpzm8\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.550262 master-0 kubenswrapper[7728]: I0223 14:19:14.550198 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-client-ca\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.550262 master-0 kubenswrapper[7728]: I0223 14:19:14.550261 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2af015a-7f5f-4702-9f06-8159a040dabe-serving-cert\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.550558 master-0 kubenswrapper[7728]: I0223 14:19:14.550288 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-proxy-ca-bundles\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.550558 master-0 kubenswrapper[7728]: I0223 14:19:14.550442 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-config\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.550708 master-0 
kubenswrapper[7728]: I0223 14:19:14.550659 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzm8\" (UniqueName: \"kubernetes.io/projected/b2af015a-7f5f-4702-9f06-8159a040dabe-kube-api-access-rpzm8\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.551246 master-0 kubenswrapper[7728]: I0223 14:19:14.551218 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-proxy-ca-bundles\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.551638 master-0 kubenswrapper[7728]: I0223 14:19:14.551595 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-client-ca\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.552335 master-0 kubenswrapper[7728]: I0223 14:19:14.552297 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-config\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.554230 master-0 kubenswrapper[7728]: I0223 14:19:14.554191 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2af015a-7f5f-4702-9f06-8159a040dabe-serving-cert\") pod \"controller-manager-789c749c66-w826g\" (UID: 
\"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.566585 master-0 kubenswrapper[7728]: I0223 14:19:14.566548 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzm8\" (UniqueName: \"kubernetes.io/projected/b2af015a-7f5f-4702-9f06-8159a040dabe-kube-api-access-rpzm8\") pod \"controller-manager-789c749c66-w826g\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.604406 master-0 kubenswrapper[7728]: I0223 14:19:14.604313 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Feb 23 14:19:14.605225 master-0 kubenswrapper[7728]: I0223 14:19:14.605182 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.607584 master-0 kubenswrapper[7728]: I0223 14:19:14.607542 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 23 14:19:14.613222 master-0 kubenswrapper[7728]: I0223 14:19:14.612533 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Feb 23 14:19:14.690981 master-0 kubenswrapper[7728]: I0223 14:19:14.690908 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:14.753014 master-0 kubenswrapper[7728]: I0223 14:19:14.752930 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-var-lock\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.753014 master-0 kubenswrapper[7728]: I0223 14:19:14.752965 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.753294 master-0 kubenswrapper[7728]: I0223 14:19:14.753132 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1bffce5-019a-4c97-85f2-929dc19a0bde-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.854536 master-0 kubenswrapper[7728]: I0223 14:19:14.854457 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-var-lock\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.854804 master-0 kubenswrapper[7728]: I0223 14:19:14.854582 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-var-lock\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.854804 master-0 kubenswrapper[7728]: I0223 14:19:14.854629 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.854804 master-0 kubenswrapper[7728]: I0223 14:19:14.854792 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1bffce5-019a-4c97-85f2-929dc19a0bde-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.855078 master-0 kubenswrapper[7728]: I0223 14:19:14.854922 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.869076 master-0 kubenswrapper[7728]: I0223 14:19:14.869021 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1bffce5-019a-4c97-85f2-929dc19a0bde-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:14.941638 master-0 kubenswrapper[7728]: I0223 14:19:14.941571 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:15.372203 master-0 kubenswrapper[7728]: I0223 14:19:15.372136 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:19:15.386545 master-0 kubenswrapper[7728]: I0223 14:19:15.386466 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:19:15.753213 master-0 kubenswrapper[7728]: I0223 14:19:15.753170 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" event={"ID":"ea0b3538-9a7d-4995-b628-2d63f21d683c","Type":"ContainerStarted","Data":"4b8ec9029f7fb9a56b769915848138fa88b329cbc5b64d2aed62c627e82f917f"} Feb 23 14:19:15.756414 master-0 kubenswrapper[7728]: I0223 14:19:15.756383 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerStarted","Data":"9204e03e920d3db05653e38c3d6e44bea01fa95318e2f1f94371f8808becda76"} Feb 23 14:19:15.758344 master-0 kubenswrapper[7728]: I0223 14:19:15.758312 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerStarted","Data":"3e43920c8c9e66c01584e52a234477388c129ea94fe151ecc6c23098a8981522"} Feb 23 14:19:15.758764 master-0 kubenswrapper[7728]: I0223 14:19:15.758708 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:19:15.781501 master-0 kubenswrapper[7728]: I0223 14:19:15.780755 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" podStartSLOduration=7.054993572 podStartE2EDuration="11.780735659s" podCreationTimestamp="2026-02-23 14:19:04 +0000 UTC" firstStartedPulling="2026-02-23 14:19:06.748105288 +0000 UTC m=+39.710766584" lastFinishedPulling="2026-02-23 14:19:11.473847365 +0000 UTC m=+44.436508671" observedRunningTime="2026-02-23 14:19:15.779532183 +0000 UTC m=+48.742193479" watchObservedRunningTime="2026-02-23 14:19:15.780735659 +0000 UTC m=+48.743396955" Feb 23 14:19:15.813575 master-0 kubenswrapper[7728]: I0223 14:19:15.813534 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-789c749c66-w826g"] Feb 23 14:19:15.821537 master-0 kubenswrapper[7728]: W0223 14:19:15.821441 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2af015a_7f5f_4702_9f06_8159a040dabe.slice/crio-e51d035a0dfb539fd05e098e8e355aca5284eb3bb12234934a07a3e19cdf8677 WatchSource:0}: Error finding container e51d035a0dfb539fd05e098e8e355aca5284eb3bb12234934a07a3e19cdf8677: Status 404 returned error can't find the container with id e51d035a0dfb539fd05e098e8e355aca5284eb3bb12234934a07a3e19cdf8677 Feb 23 14:19:15.873809 master-0 kubenswrapper[7728]: I0223 14:19:15.873764 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z"] Feb 23 14:19:15.887954 master-0 kubenswrapper[7728]: W0223 14:19:15.887835 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9694f604_5dcf_4aec_a54f_67b4d8dc8809.slice/crio-d787bbe507abbf8e0df335f2db362ff8804d2d269abbde4b9813202bd59071f6 WatchSource:0}: Error finding container d787bbe507abbf8e0df335f2db362ff8804d2d269abbde4b9813202bd59071f6: Status 404 returned error can't find the container with id 
d787bbe507abbf8e0df335f2db362ff8804d2d269abbde4b9813202bd59071f6 Feb 23 14:19:15.895592 master-0 kubenswrapper[7728]: I0223 14:19:15.894362 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Feb 23 14:19:15.896090 master-0 kubenswrapper[7728]: I0223 14:19:15.896008 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Feb 23 14:19:15.927266 master-0 kubenswrapper[7728]: W0223 14:19:15.927205 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd1bffce5_019a_4c97_85f2_929dc19a0bde.slice/crio-d2f40a1ec635a92856cfa2a3f7bc1ac31eb82fe0e85c0e36c0e8828880ceda10 WatchSource:0}: Error finding container d2f40a1ec635a92856cfa2a3f7bc1ac31eb82fe0e85c0e36c0e8828880ceda10: Status 404 returned error can't find the container with id d2f40a1ec635a92856cfa2a3f7bc1ac31eb82fe0e85c0e36c0e8828880ceda10 Feb 23 14:19:16.503213 master-0 kubenswrapper[7728]: I0223 14:19:16.499549 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Feb 23 14:19:16.503213 master-0 kubenswrapper[7728]: I0223 14:19:16.500211 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.503213 master-0 kubenswrapper[7728]: I0223 14:19:16.501841 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 14:19:16.519583 master-0 kubenswrapper[7728]: I0223 14:19:16.519111 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Feb 23 14:19:16.575515 master-0 kubenswrapper[7728]: I0223 14:19:16.575443 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-var-lock\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.575515 master-0 kubenswrapper[7728]: I0223 14:19:16.575518 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kube-api-access\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.575730 master-0 kubenswrapper[7728]: I0223 14:19:16.575539 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.676886 master-0 kubenswrapper[7728]: I0223 14:19:16.676771 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-var-lock\") pod \"installer-1-master-0\" (UID: 
\"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.676886 master-0 kubenswrapper[7728]: I0223 14:19:16.676824 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kube-api-access\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.676886 master-0 kubenswrapper[7728]: I0223 14:19:16.676844 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.677113 master-0 kubenswrapper[7728]: I0223 14:19:16.676923 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.677113 master-0 kubenswrapper[7728]: I0223 14:19:16.676955 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-var-lock\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.692188 master-0 kubenswrapper[7728]: I0223 14:19:16.692147 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kube-api-access\") pod \"installer-1-master-0\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " 
pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.767887 master-0 kubenswrapper[7728]: I0223 14:19:16.767582 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" event={"ID":"9694f604-5dcf-4aec-a54f-67b4d8dc8809","Type":"ContainerStarted","Data":"f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0"} Feb 23 14:19:16.767887 master-0 kubenswrapper[7728]: I0223 14:19:16.767626 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" event={"ID":"9694f604-5dcf-4aec-a54f-67b4d8dc8809","Type":"ContainerStarted","Data":"d787bbe507abbf8e0df335f2db362ff8804d2d269abbde4b9813202bd59071f6"} Feb 23 14:19:16.768314 master-0 kubenswrapper[7728]: I0223 14:19:16.768281 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:16.770556 master-0 kubenswrapper[7728]: I0223 14:19:16.770519 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" event={"ID":"b2af015a-7f5f-4702-9f06-8159a040dabe","Type":"ContainerStarted","Data":"ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f"} Feb 23 14:19:16.770646 master-0 kubenswrapper[7728]: I0223 14:19:16.770560 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" event={"ID":"b2af015a-7f5f-4702-9f06-8159a040dabe","Type":"ContainerStarted","Data":"e51d035a0dfb539fd05e098e8e355aca5284eb3bb12234934a07a3e19cdf8677"} Feb 23 14:19:16.787617 master-0 kubenswrapper[7728]: I0223 14:19:16.787553 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:16.787617 master-0 kubenswrapper[7728]: 
I0223 14:19:16.787599 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerStarted","Data":"2243bd1ff5f9d8cba5c5a4b887f7a319c493e141478b3849cae308750eee3552"} Feb 23 14:19:16.791007 master-0 kubenswrapper[7728]: I0223 14:19:16.790981 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"5f67ab24-82bc-4e71-b974-e25b819986c8","Type":"ContainerStarted","Data":"c172e0b4868c308f20f7ae8b13ba955f59eebc66ffba5fd517b3648866cbe26f"} Feb 23 14:19:16.791081 master-0 kubenswrapper[7728]: I0223 14:19:16.791010 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"5f67ab24-82bc-4e71-b974-e25b819986c8","Type":"ContainerStarted","Data":"6dfbc560ad1e1a5e7a72cca845fe403480de4a21ae08827666c9a7f55f0e049e"} Feb 23 14:19:16.796982 master-0 kubenswrapper[7728]: I0223 14:19:16.794744 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d1bffce5-019a-4c97-85f2-929dc19a0bde","Type":"ContainerStarted","Data":"e8935c6e444aa0e24024f5ff856a9b30868587f159c8c2351155b9dff7539917"} Feb 23 14:19:16.796982 master-0 kubenswrapper[7728]: I0223 14:19:16.794822 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d1bffce5-019a-4c97-85f2-929dc19a0bde","Type":"ContainerStarted","Data":"d2f40a1ec635a92856cfa2a3f7bc1ac31eb82fe0e85c0e36c0e8828880ceda10"} Feb 23 14:19:16.828414 master-0 kubenswrapper[7728]: I0223 14:19:16.828325 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" podStartSLOduration=6.82830391 podStartE2EDuration="6.82830391s" podCreationTimestamp="2026-02-23 14:19:10 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:16.794777977 +0000 UTC m=+49.757439273" watchObservedRunningTime="2026-02-23 14:19:16.82830391 +0000 UTC m=+49.790965206" Feb 23 14:19:16.829025 master-0 kubenswrapper[7728]: I0223 14:19:16.828980 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=3.828973594 podStartE2EDuration="3.828973594s" podCreationTimestamp="2026-02-23 14:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:16.827034382 +0000 UTC m=+49.789695688" watchObservedRunningTime="2026-02-23 14:19:16.828973594 +0000 UTC m=+49.791634890" Feb 23 14:19:16.851536 master-0 kubenswrapper[7728]: I0223 14:19:16.850321 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:19:16.851536 master-0 kubenswrapper[7728]: I0223 14:19:16.850356 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" podStartSLOduration=6.850329224 podStartE2EDuration="6.850329224s" podCreationTimestamp="2026-02-23 14:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:16.850052108 +0000 UTC m=+49.812713424" watchObservedRunningTime="2026-02-23 14:19:16.850329224 +0000 UTC m=+49.812990560" Feb 23 14:19:16.886782 master-0 kubenswrapper[7728]: I0223 14:19:16.886701 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-666b887977-f7h55" podStartSLOduration=18.886675877 podStartE2EDuration="18.886675877s" podCreationTimestamp="2026-02-23 14:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:16.883514529 +0000 UTC m=+49.846175825" watchObservedRunningTime="2026-02-23 14:19:16.886675877 +0000 UTC m=+49.849337193" Feb 23 14:19:16.924773 master-0 kubenswrapper[7728]: I0223 14:19:16.924356 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=2.924333529 podStartE2EDuration="2.924333529s" podCreationTimestamp="2026-02-23 14:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:16.924322678 +0000 UTC m=+49.886983974" watchObservedRunningTime="2026-02-23 14:19:16.924333529 +0000 UTC m=+49.886994825" Feb 23 14:19:16.983293 master-0 kubenswrapper[7728]: I0223 14:19:16.983240 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Feb 23 14:19:16.983615 master-0 kubenswrapper[7728]: I0223 14:19:16.983556 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="ab99fbb6-d945-4bf7-a85c-239f83137a4d" containerName="installer" containerID="cri-o://1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a" gracePeriod=30 Feb 23 14:19:17.065916 master-0 kubenswrapper[7728]: I0223 14:19:17.061842 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-86l7f" Feb 23 14:19:17.290812 master-0 kubenswrapper[7728]: I0223 14:19:17.290145 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Feb 23 14:19:17.342701 master-0 kubenswrapper[7728]: I0223 14:19:17.342650 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_ab99fbb6-d945-4bf7-a85c-239f83137a4d/installer/0.log" Feb 23 14:19:17.342701 
master-0 kubenswrapper[7728]: I0223 14:19:17.342716 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Feb 23 14:19:17.517733 master-0 kubenswrapper[7728]: I0223 14:19:17.517671 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-var-lock\") pod \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " Feb 23 14:19:17.518243 master-0 kubenswrapper[7728]: I0223 14:19:17.517792 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kubelet-dir\") pod \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " Feb 23 14:19:17.518243 master-0 kubenswrapper[7728]: I0223 14:19:17.517810 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-var-lock" (OuterVolumeSpecName: "var-lock") pod "ab99fbb6-d945-4bf7-a85c-239f83137a4d" (UID: "ab99fbb6-d945-4bf7-a85c-239f83137a4d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:17.518243 master-0 kubenswrapper[7728]: I0223 14:19:17.517831 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ab99fbb6-d945-4bf7-a85c-239f83137a4d" (UID: "ab99fbb6-d945-4bf7-a85c-239f83137a4d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:17.518243 master-0 kubenswrapper[7728]: I0223 14:19:17.517842 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kube-api-access\") pod \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\" (UID: \"ab99fbb6-d945-4bf7-a85c-239f83137a4d\") " Feb 23 14:19:17.518243 master-0 kubenswrapper[7728]: I0223 14:19:17.518203 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:17.518243 master-0 kubenswrapper[7728]: I0223 14:19:17.518218 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:17.521019 master-0 kubenswrapper[7728]: I0223 14:19:17.520980 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ab99fbb6-d945-4bf7-a85c-239f83137a4d" (UID: "ab99fbb6-d945-4bf7-a85c-239f83137a4d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:17.619643 master-0 kubenswrapper[7728]: I0223 14:19:17.619579 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab99fbb6-d945-4bf7-a85c-239f83137a4d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:17.802245 master-0 kubenswrapper[7728]: I0223 14:19:17.802147 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"29f7b30e-bf6a-4e54-b009-1b0fcd830035","Type":"ContainerStarted","Data":"325ae25a8338b6a2543759476e50b822896d1071332fcb78a23d45a461fab54f"} Feb 23 14:19:17.802245 master-0 kubenswrapper[7728]: I0223 14:19:17.802220 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"29f7b30e-bf6a-4e54-b009-1b0fcd830035","Type":"ContainerStarted","Data":"d4622d20df32d4655ff4c5d8c0ab82bdd9c1a367900ef291744093a4801a66c4"} Feb 23 14:19:17.806389 master-0 kubenswrapper[7728]: I0223 14:19:17.806342 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_ab99fbb6-d945-4bf7-a85c-239f83137a4d/installer/0.log" Feb 23 14:19:17.806661 master-0 kubenswrapper[7728]: I0223 14:19:17.806403 7728 generic.go:334] "Generic (PLEG): container finished" podID="ab99fbb6-d945-4bf7-a85c-239f83137a4d" containerID="1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a" exitCode=1 Feb 23 14:19:17.806661 master-0 kubenswrapper[7728]: I0223 14:19:17.806561 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Feb 23 14:19:17.807201 master-0 kubenswrapper[7728]: I0223 14:19:17.807142 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"ab99fbb6-d945-4bf7-a85c-239f83137a4d","Type":"ContainerDied","Data":"1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a"}
Feb 23 14:19:17.807201 master-0 kubenswrapper[7728]: I0223 14:19:17.807190 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"ab99fbb6-d945-4bf7-a85c-239f83137a4d","Type":"ContainerDied","Data":"244a5c8ecd9fd0725cd2b5c5bda17cdcac940497bdf23d6c2d83d2e9ef3475bf"}
Feb 23 14:19:17.807430 master-0 kubenswrapper[7728]: I0223 14:19:17.807217 7728 scope.go:117] "RemoveContainer" containerID="1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a"
Feb 23 14:19:17.808829 master-0 kubenswrapper[7728]: I0223 14:19:17.808791 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-789c749c66-w826g"
Feb 23 14:19:17.817695 master-0 kubenswrapper[7728]: I0223 14:19:17.817630 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-789c749c66-w826g"
Feb 23 14:19:17.848596 master-0 kubenswrapper[7728]: I0223 14:19:17.848548 7728 scope.go:117] "RemoveContainer" containerID="1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a"
Feb 23 14:19:17.849143 master-0 kubenswrapper[7728]: E0223 14:19:17.849090 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a\": container with ID starting with 1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a not found: ID does not exist" containerID="1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a"
Feb 23 14:19:17.849207 master-0 kubenswrapper[7728]: I0223 14:19:17.849133 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a"} err="failed to get container status \"1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a\": rpc error: code = NotFound desc = could not find container \"1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a\": container with ID starting with 1428da8f7e46531b25d14355faa50d250078cad565c334445a3cf45616e87b7a not found: ID does not exist"
Feb 23 14:19:17.893565 master-0 kubenswrapper[7728]: I0223 14:19:17.893430 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=1.8934111279999999 podStartE2EDuration="1.893411128s" podCreationTimestamp="2026-02-23 14:19:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:17.84106163 +0000 UTC m=+50.803722966" watchObservedRunningTime="2026-02-23 14:19:17.893411128 +0000 UTC m=+50.856072434"
Feb 23 14:19:17.906555 master-0 kubenswrapper[7728]: I0223 14:19:17.906446 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Feb 23 14:19:17.911135 master-0 kubenswrapper[7728]: I0223 14:19:17.910509 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Feb 23 14:19:18.015499 master-0 kubenswrapper[7728]: I0223 14:19:18.015409 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:19:18.015669 master-0 kubenswrapper[7728]: I0223 14:19:18.015537 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: I0223 14:19:18.029282 7728 patch_prober.go:28] interesting pod/apiserver-666b887977-f7h55 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]log ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]etcd ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/max-in-flight-filter ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/openshift.io-startinformers ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/openshift.io-restmapperupdater ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: livez check failed
Feb 23 14:19:18.030143 master-0 kubenswrapper[7728]: I0223 14:19:18.029360 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-666b887977-f7h55" podUID="588a804a-430a-47f4-aa97-c08e907239da" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:19:19.170214 master-0 kubenswrapper[7728]: I0223 14:19:19.170150 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Feb 23 14:19:19.170985 master-0 kubenswrapper[7728]: E0223 14:19:19.170317 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab99fbb6-d945-4bf7-a85c-239f83137a4d" containerName="installer"
Feb 23 14:19:19.170985 master-0 kubenswrapper[7728]: I0223 14:19:19.170329 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab99fbb6-d945-4bf7-a85c-239f83137a4d" containerName="installer"
Feb 23 14:19:19.170985 master-0 kubenswrapper[7728]: I0223 14:19:19.170405 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab99fbb6-d945-4bf7-a85c-239f83137a4d" containerName="installer"
Feb 23 14:19:19.170985 master-0 kubenswrapper[7728]: I0223 14:19:19.170717 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.201536 master-0 kubenswrapper[7728]: I0223 14:19:19.201440 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Feb 23 14:19:19.229501 master-0 kubenswrapper[7728]: I0223 14:19:19.226917 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab99fbb6-d945-4bf7-a85c-239f83137a4d" path="/var/lib/kubelet/pods/ab99fbb6-d945-4bf7-a85c-239f83137a4d/volumes"
Feb 23 14:19:19.341288 master-0 kubenswrapper[7728]: I0223 14:19:19.341218 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kube-api-access\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.341579 master-0 kubenswrapper[7728]: I0223 14:19:19.341320 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-var-lock\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.341579 master-0 kubenswrapper[7728]: I0223 14:19:19.341392 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.442219 master-0 kubenswrapper[7728]: I0223 14:19:19.442166 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kube-api-access\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.442409 master-0 kubenswrapper[7728]: I0223 14:19:19.442376 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-var-lock\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.442519 master-0 kubenswrapper[7728]: I0223 14:19:19.442471 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.442657 master-0 kubenswrapper[7728]: I0223 14:19:19.442596 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-var-lock\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.442725 master-0 kubenswrapper[7728]: I0223 14:19:19.442688 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.461288 master-0 kubenswrapper[7728]: I0223 14:19:19.461227 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kube-api-access\") pod \"installer-3-master-0\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.488202 master-0 kubenswrapper[7728]: I0223 14:19:19.487771 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Feb 23 14:19:19.539194 master-0 kubenswrapper[7728]: I0223 14:19:19.538849 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"]
Feb 23 14:19:19.539407 master-0 kubenswrapper[7728]: I0223 14:19:19.539376 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.541594 master-0 kubenswrapper[7728]: I0223 14:19:19.541431 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 23 14:19:19.541594 master-0 kubenswrapper[7728]: I0223 14:19:19.541452 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 23 14:19:19.541897 master-0 kubenswrapper[7728]: I0223 14:19:19.541871 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 23 14:19:19.545037 master-0 kubenswrapper[7728]: I0223 14:19:19.544868 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4373687a-61a0-434b-81f7-3fecaa1494ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.545037 master-0 kubenswrapper[7728]: I0223 14:19:19.544960 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv5nj\" (UniqueName: \"kubernetes.io/projected/4373687a-61a0-434b-81f7-3fecaa1494ef-kube-api-access-wv5nj\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.548860 master-0 kubenswrapper[7728]: I0223 14:19:19.548801 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"]
Feb 23 14:19:19.646165 master-0 kubenswrapper[7728]: I0223 14:19:19.646133 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4373687a-61a0-434b-81f7-3fecaa1494ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.646436 master-0 kubenswrapper[7728]: I0223 14:19:19.646410 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv5nj\" (UniqueName: \"kubernetes.io/projected/4373687a-61a0-434b-81f7-3fecaa1494ef-kube-api-access-wv5nj\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.649288 master-0 kubenswrapper[7728]: I0223 14:19:19.649265 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4373687a-61a0-434b-81f7-3fecaa1494ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.664277 master-0 kubenswrapper[7728]: I0223 14:19:19.664241 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv5nj\" (UniqueName: \"kubernetes.io/projected/4373687a-61a0-434b-81f7-3fecaa1494ef-kube-api-access-wv5nj\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.873337 master-0 kubenswrapper[7728]: I0223 14:19:19.873248 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:19:19.912286 master-0 kubenswrapper[7728]: I0223 14:19:19.912242 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Feb 23 14:19:20.122249 master-0 kubenswrapper[7728]: I0223 14:19:20.122197 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:20.123223 master-0 kubenswrapper[7728]: I0223 14:19:20.123198 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:20.338940 master-0 kubenswrapper[7728]: I0223 14:19:20.338632 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:20.571498 master-0 kubenswrapper[7728]: I0223 14:19:20.571420 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"]
Feb 23 14:19:20.581187 master-0 kubenswrapper[7728]: W0223 14:19:20.581147 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4373687a_61a0_434b_81f7_3fecaa1494ef.slice/crio-4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242 WatchSource:0}: Error finding container 4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242: Status 404 returned error can't find the container with id 4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242
Feb 23 14:19:20.832948 master-0 kubenswrapper[7728]: I0223 14:19:20.832785 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126","Type":"ContainerStarted","Data":"0e1388d50e8407b49b200cd8d22318640f8b3d9ec7fffb9a2e4a3714e4725182"}
Feb 23 14:19:20.832948 master-0 kubenswrapper[7728]: I0223 14:19:20.832849 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126","Type":"ContainerStarted","Data":"4c3108828f0ca8b6e625b4baa7448d873f784369cabaee373f77769b11f1f00a"}
Feb 23 14:19:20.834041 master-0 kubenswrapper[7728]: I0223 14:19:20.834000 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266" event={"ID":"4373687a-61a0-434b-81f7-3fecaa1494ef","Type":"ContainerStarted","Data":"4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242"}
Feb 23 14:19:20.838880 master-0 kubenswrapper[7728]: I0223 14:19:20.838847 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:19:21.952106 master-0 kubenswrapper[7728]: I0223 14:19:21.952018 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=2.951998899 podStartE2EDuration="2.951998899s" podCreationTimestamp="2026-02-23 14:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:21.950882385 +0000 UTC m=+54.913543681" watchObservedRunningTime="2026-02-23 14:19:21.951998899 +0000 UTC m=+54.914660195"
Feb 23 14:19:22.882142 master-0 kubenswrapper[7728]: I0223 14:19:22.881957 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"]
Feb 23 14:19:22.882964 master-0 kubenswrapper[7728]: I0223 14:19:22.882840 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:22.886662 master-0 kubenswrapper[7728]: I0223 14:19:22.886526 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 23 14:19:22.892603 master-0 kubenswrapper[7728]: I0223 14:19:22.887299 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 23 14:19:22.892603 master-0 kubenswrapper[7728]: I0223 14:19:22.887693 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 23 14:19:22.892603 master-0 kubenswrapper[7728]: I0223 14:19:22.887864 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 23 14:19:22.893723 master-0 kubenswrapper[7728]: I0223 14:19:22.893687 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 23 14:19:22.921928 master-0 kubenswrapper[7728]: I0223 14:19:22.921876 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-machine-approver-tls\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:22.922134 master-0 kubenswrapper[7728]: I0223 14:19:22.921955 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-config\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:22.922134 master-0 kubenswrapper[7728]: I0223 14:19:22.921996 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-auth-proxy-config\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:22.922134 master-0 kubenswrapper[7728]: I0223 14:19:22.922058 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdcpf\" (UniqueName: \"kubernetes.io/projected/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-kube-api-access-zdcpf\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.022789 master-0 kubenswrapper[7728]: I0223 14:19:23.022743 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-auth-proxy-config\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.023150 master-0 kubenswrapper[7728]: I0223 14:19:23.022807 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdcpf\" (UniqueName: \"kubernetes.io/projected/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-kube-api-access-zdcpf\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.023150 master-0 kubenswrapper[7728]: I0223 14:19:23.022879 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-machine-approver-tls\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.023150 master-0 kubenswrapper[7728]: I0223 14:19:23.022911 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-config\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.023561 master-0 kubenswrapper[7728]: I0223 14:19:23.023511 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-auth-proxy-config\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.023765 master-0 kubenswrapper[7728]: I0223 14:19:23.023733 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-config\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.031295 master-0 kubenswrapper[7728]: I0223 14:19:23.027291 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-machine-approver-tls\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.031295 master-0 kubenswrapper[7728]: I0223 14:19:23.028596 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:19:23.033988 master-0 kubenswrapper[7728]: I0223 14:19:23.033960 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:19:23.046918 master-0 kubenswrapper[7728]: I0223 14:19:23.043646 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdcpf\" (UniqueName: \"kubernetes.io/projected/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-kube-api-access-zdcpf\") pod \"machine-approver-798b897698-p8wds\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.212104 master-0 kubenswrapper[7728]: I0223 14:19:23.212063 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"
Feb 23 14:19:23.227595 master-0 kubenswrapper[7728]: W0223 14:19:23.227565 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd45c7d67_e103_4f68_b10f_9a1f3c56af6e.slice/crio-f2b1a440d4d0941474a18e0ad7a1503dff271821abbfe9ad25e9a1f920cb81fc WatchSource:0}: Error finding container f2b1a440d4d0941474a18e0ad7a1503dff271821abbfe9ad25e9a1f920cb81fc: Status 404 returned error can't find the container with id f2b1a440d4d0941474a18e0ad7a1503dff271821abbfe9ad25e9a1f920cb81fc
Feb 23 14:19:23.854091 master-0 kubenswrapper[7728]: I0223 14:19:23.853946 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266" event={"ID":"4373687a-61a0-434b-81f7-3fecaa1494ef","Type":"ContainerStarted","Data":"9b45bf126e1d92621372b72946a5700b9c49834f8698b4a6266b185922dfcbee"}
Feb 23 14:19:23.856514 master-0 kubenswrapper[7728]: I0223 14:19:23.856463 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" event={"ID":"d45c7d67-e103-4f68-b10f-9a1f3c56af6e","Type":"ContainerStarted","Data":"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a"}
Feb 23 14:19:23.856514 master-0 kubenswrapper[7728]: I0223 14:19:23.856511 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" event={"ID":"d45c7d67-e103-4f68-b10f-9a1f3c56af6e","Type":"ContainerStarted","Data":"f2b1a440d4d0941474a18e0ad7a1503dff271821abbfe9ad25e9a1f920cb81fc"}
Feb 23 14:19:25.690347 master-0 kubenswrapper[7728]: I0223 14:19:25.690279 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266" podStartSLOduration=4.278066012 podStartE2EDuration="6.690258792s" podCreationTimestamp="2026-02-23 14:19:19 +0000 UTC" firstStartedPulling="2026-02-23 14:19:20.58319713 +0000 UTC m=+53.545858416" lastFinishedPulling="2026-02-23 14:19:22.9953899 +0000 UTC m=+55.958051196" observedRunningTime="2026-02-23 14:19:23.870711959 +0000 UTC m=+56.833373345" watchObservedRunningTime="2026-02-23 14:19:25.690258792 +0000 UTC m=+58.652920088"
Feb 23 14:19:25.692084 master-0 kubenswrapper[7728]: I0223 14:19:25.692061 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"]
Feb 23 14:19:25.692784 master-0 kubenswrapper[7728]: I0223 14:19:25.692767 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.694786 master-0 kubenswrapper[7728]: I0223 14:19:25.694734 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-hrd9b"
Feb 23 14:19:25.695135 master-0 kubenswrapper[7728]: I0223 14:19:25.695088 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Feb 23 14:19:25.695202 master-0 kubenswrapper[7728]: I0223 14:19:25.695172 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Feb 23 14:19:25.695606 master-0 kubenswrapper[7728]: I0223 14:19:25.695561 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Feb 23 14:19:25.702582 master-0 kubenswrapper[7728]: I0223 14:19:25.702535 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Feb 23 14:19:25.705339 master-0 kubenswrapper[7728]: I0223 14:19:25.705291 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"]
Feb 23 14:19:25.748382 master-0 kubenswrapper[7728]: I0223 14:19:25.748309 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5d9w\" (UniqueName: \"kubernetes.io/projected/85365dec-af50-406c-b258-890e4f454c4a-kube-api-access-k5d9w\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.748582 master-0 kubenswrapper[7728]: I0223 14:19:25.748439 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85365dec-af50-406c-b258-890e4f454c4a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.748582 master-0 kubenswrapper[7728]: I0223 14:19:25.748497 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85365dec-af50-406c-b258-890e4f454c4a-cco-trusted-ca\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.849392 master-0 kubenswrapper[7728]: I0223 14:19:25.849339 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85365dec-af50-406c-b258-890e4f454c4a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.850119 master-0 kubenswrapper[7728]: I0223 14:19:25.849404 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85365dec-af50-406c-b258-890e4f454c4a-cco-trusted-ca\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.850119 master-0 kubenswrapper[7728]: I0223 14:19:25.849777 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5d9w\" (UniqueName: \"kubernetes.io/projected/85365dec-af50-406c-b258-890e4f454c4a-kube-api-access-k5d9w\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.850881 master-0 kubenswrapper[7728]: I0223 14:19:25.850849 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85365dec-af50-406c-b258-890e4f454c4a-cco-trusted-ca\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.853519 master-0 kubenswrapper[7728]: I0223 14:19:25.853442 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85365dec-af50-406c-b258-890e4f454c4a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.870069 master-0 kubenswrapper[7728]: I0223 14:19:25.869958 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" event={"ID":"d45c7d67-e103-4f68-b10f-9a1f3c56af6e","Type":"ContainerStarted","Data":"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5"}
Feb 23 14:19:25.884092 master-0 kubenswrapper[7728]: I0223 14:19:25.884034 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5d9w\" (UniqueName: \"kubernetes.io/projected/85365dec-af50-406c-b258-890e4f454c4a-kube-api-access-k5d9w\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:25.889708 master-0 kubenswrapper[7728]: I0223 14:19:25.889633 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" podStartSLOduration=2.05830328 podStartE2EDuration="3.889610777s" podCreationTimestamp="2026-02-23 14:19:22 +0000 UTC" firstStartedPulling="2026-02-23 14:19:23.524896858 +0000 UTC m=+56.487558144" lastFinishedPulling="2026-02-23 14:19:25.356204345 +0000 UTC m=+58.318865641" observedRunningTime="2026-02-23 14:19:25.889050745 +0000 UTC m=+58.851712061" watchObservedRunningTime="2026-02-23 14:19:25.889610777 +0000 UTC m=+58.852272083"
Feb 23 14:19:26.023284 master-0 kubenswrapper[7728]: I0223 14:19:26.023201 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:19:26.450719 master-0 kubenswrapper[7728]: I0223 14:19:26.450671 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"]
Feb 23 14:19:26.460737 master-0 kubenswrapper[7728]: W0223 14:19:26.460669 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85365dec_af50_406c_b258_890e4f454c4a.slice/crio-b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9 WatchSource:0}: Error finding container b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9: Status 404 returned error can't find the container with id b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9
Feb 23 14:19:26.607036 master-0 kubenswrapper[7728]: I0223 14:19:26.604521 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"]
Feb 23 14:19:26.607036 master-0 kubenswrapper[7728]: I0223 14:19:26.605213 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"
Feb 23 14:19:26.607279 master-0 kubenswrapper[7728]: I0223 14:19:26.607173 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 23 14:19:26.608258 master-0 kubenswrapper[7728]: I0223 14:19:26.608228 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 23 14:19:26.608434 master-0 kubenswrapper[7728]: I0223 14:19:26.608409 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 23 14:19:26.631189 master-0 kubenswrapper[7728]: I0223 14:19:26.631139 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 23 14:19:26.637005 master-0 kubenswrapper[7728]: I0223 14:19:26.636962 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"]
Feb 23 14:19:26.661890 master-0 kubenswrapper[7728]: I0223 14:19:26.661824 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad0f0d72-0337-4347-bb50-e299a175f3ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"
Feb 23 14:19:26.661890 master-0 kubenswrapper[7728]: I0223 14:19:26.661895 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0f0d72-0337-4347-bb50-e299a175f3ca-trusted-ca\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") "
pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.662173 master-0 kubenswrapper[7728]: I0223 14:19:26.661932 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knkx2\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-kube-api-access-knkx2\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.662173 master-0 kubenswrapper[7728]: I0223 14:19:26.661963 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-bound-sa-token\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.762930 master-0 kubenswrapper[7728]: I0223 14:19:26.762879 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad0f0d72-0337-4347-bb50-e299a175f3ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.763462 master-0 kubenswrapper[7728]: I0223 14:19:26.762945 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0f0d72-0337-4347-bb50-e299a175f3ca-trusted-ca\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.763462 
master-0 kubenswrapper[7728]: I0223 14:19:26.762983 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkx2\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-kube-api-access-knkx2\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.763462 master-0 kubenswrapper[7728]: I0223 14:19:26.763014 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-bound-sa-token\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.765156 master-0 kubenswrapper[7728]: I0223 14:19:26.764772 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0f0d72-0337-4347-bb50-e299a175f3ca-trusted-ca\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.772092 master-0 kubenswrapper[7728]: I0223 14:19:26.772050 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad0f0d72-0337-4347-bb50-e299a175f3ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.777981 master-0 kubenswrapper[7728]: I0223 14:19:26.777931 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-bound-sa-token\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.779071 master-0 kubenswrapper[7728]: I0223 14:19:26.779032 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkx2\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-kube-api-access-knkx2\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:26.877930 master-0 kubenswrapper[7728]: I0223 14:19:26.877000 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerStarted","Data":"4f625178953c567a0bf9ef3fc88c664d915e017cda848446b8ac0cad04aeff48"} Feb 23 14:19:26.877930 master-0 kubenswrapper[7728]: I0223 14:19:26.877050 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerStarted","Data":"b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9"} Feb 23 14:19:26.878609 master-0 kubenswrapper[7728]: I0223 14:19:26.878567 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_5483bcd0-a9c7-4fdf-9c55-03f85a06b303/installer/0.log" Feb 23 14:19:26.878685 master-0 kubenswrapper[7728]: I0223 14:19:26.878620 7728 generic.go:334] "Generic (PLEG): container finished" podID="5483bcd0-a9c7-4fdf-9c55-03f85a06b303" containerID="2fddcd1257ffb4e028ba0fdbc707d561c98a5d237e44575892b25e37320d6d1d" exitCode=1 Feb 23 14:19:26.879316 
master-0 kubenswrapper[7728]: I0223 14:19:26.879286 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"5483bcd0-a9c7-4fdf-9c55-03f85a06b303","Type":"ContainerDied","Data":"2fddcd1257ffb4e028ba0fdbc707d561c98a5d237e44575892b25e37320d6d1d"} Feb 23 14:19:26.880637 master-0 kubenswrapper[7728]: I0223 14:19:26.880419 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5"] Feb 23 14:19:26.881313 master-0 kubenswrapper[7728]: I0223 14:19:26.881290 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:26.887177 master-0 kubenswrapper[7728]: I0223 14:19:26.886093 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 14:19:26.887177 master-0 kubenswrapper[7728]: I0223 14:19:26.886819 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 23 14:19:26.887177 master-0 kubenswrapper[7728]: I0223 14:19:26.886890 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 23 14:19:26.898242 master-0 kubenswrapper[7728]: I0223 14:19:26.897690 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5"] Feb 23 14:19:26.965292 master-0 kubenswrapper[7728]: I0223 14:19:26.965228 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7rw\" (UniqueName: \"kubernetes.io/projected/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-kube-api-access-rr7rw\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:26.965292 master-0 kubenswrapper[7728]: I0223 14:19:26.965277 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-samples-operator-tls\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:27.051567 master-0 kubenswrapper[7728]: I0223 14:19:27.051406 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:19:27.066136 master-0 kubenswrapper[7728]: I0223 14:19:27.066072 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7rw\" (UniqueName: \"kubernetes.io/projected/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-kube-api-access-rr7rw\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:27.066136 master-0 kubenswrapper[7728]: I0223 14:19:27.066131 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-samples-operator-tls\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:27.072494 master-0 kubenswrapper[7728]: I0223 14:19:27.072397 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-samples-operator-tls\") pod 
\"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:27.089762 master-0 kubenswrapper[7728]: I0223 14:19:27.089716 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7rw\" (UniqueName: \"kubernetes.io/projected/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-kube-api-access-rr7rw\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:27.210408 master-0 kubenswrapper[7728]: I0223 14:19:27.210364 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:19:27.437898 master-0 kubenswrapper[7728]: I0223 14:19:27.437802 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_5483bcd0-a9c7-4fdf-9c55-03f85a06b303/installer/0.log" Feb 23 14:19:27.437898 master-0 kubenswrapper[7728]: I0223 14:19:27.437861 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:19:27.578827 master-0 kubenswrapper[7728]: I0223 14:19:27.575609 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-var-lock\") pod \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " Feb 23 14:19:27.578827 master-0 kubenswrapper[7728]: I0223 14:19:27.575732 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kube-api-access\") pod \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " Feb 23 14:19:27.578827 master-0 kubenswrapper[7728]: I0223 14:19:27.575793 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kubelet-dir\") pod \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\" (UID: \"5483bcd0-a9c7-4fdf-9c55-03f85a06b303\") " Feb 23 14:19:27.578827 master-0 kubenswrapper[7728]: I0223 14:19:27.575906 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-var-lock" (OuterVolumeSpecName: "var-lock") pod "5483bcd0-a9c7-4fdf-9c55-03f85a06b303" (UID: "5483bcd0-a9c7-4fdf-9c55-03f85a06b303"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:27.578827 master-0 kubenswrapper[7728]: I0223 14:19:27.576139 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:27.578827 master-0 kubenswrapper[7728]: I0223 14:19:27.576249 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5483bcd0-a9c7-4fdf-9c55-03f85a06b303" (UID: "5483bcd0-a9c7-4fdf-9c55-03f85a06b303"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:27.581268 master-0 kubenswrapper[7728]: I0223 14:19:27.581237 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5483bcd0-a9c7-4fdf-9c55-03f85a06b303" (UID: "5483bcd0-a9c7-4fdf-9c55-03f85a06b303"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:27.676810 master-0 kubenswrapper[7728]: I0223 14:19:27.676773 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:27.677041 master-0 kubenswrapper[7728]: I0223 14:19:27.677027 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5483bcd0-a9c7-4fdf-9c55-03f85a06b303-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:27.897608 master-0 kubenswrapper[7728]: I0223 14:19:27.897437 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_5483bcd0-a9c7-4fdf-9c55-03f85a06b303/installer/0.log" Feb 23 14:19:27.898343 master-0 kubenswrapper[7728]: I0223 14:19:27.897605 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"5483bcd0-a9c7-4fdf-9c55-03f85a06b303","Type":"ContainerDied","Data":"204d0959f6c407ac3fee0a708d6a503403aa51babda5e1e3e7023222624b4361"} Feb 23 14:19:27.898343 master-0 kubenswrapper[7728]: I0223 14:19:27.897677 7728 scope.go:117] "RemoveContainer" containerID="2fddcd1257ffb4e028ba0fdbc707d561c98a5d237e44575892b25e37320d6d1d" Feb 23 14:19:27.898343 master-0 kubenswrapper[7728]: I0223 14:19:27.897710 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Feb 23 14:19:28.580047 master-0 kubenswrapper[7728]: I0223 14:19:28.579980 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"] Feb 23 14:19:28.582629 master-0 kubenswrapper[7728]: I0223 14:19:28.582597 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5"] Feb 23 14:19:28.589251 master-0 kubenswrapper[7728]: W0223 14:19:28.587353 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad0f0d72_0337_4347_bb50_e299a175f3ca.slice/crio-3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a WatchSource:0}: Error finding container 3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a: Status 404 returned error can't find the container with id 3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a Feb 23 14:19:28.632239 master-0 kubenswrapper[7728]: I0223 14:19:28.632193 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Feb 23 14:19:28.647853 master-0 kubenswrapper[7728]: I0223 14:19:28.647792 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Feb 23 14:19:28.668308 master-0 kubenswrapper[7728]: I0223 14:19:28.668236 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp"] Feb 23 14:19:28.669556 master-0 kubenswrapper[7728]: E0223 14:19:28.668741 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5483bcd0-a9c7-4fdf-9c55-03f85a06b303" containerName="installer" Feb 23 14:19:28.669556 master-0 kubenswrapper[7728]: I0223 14:19:28.668770 7728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5483bcd0-a9c7-4fdf-9c55-03f85a06b303" containerName="installer" Feb 23 14:19:28.669556 master-0 kubenswrapper[7728]: I0223 14:19:28.668992 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5483bcd0-a9c7-4fdf-9c55-03f85a06b303" containerName="installer" Feb 23 14:19:28.670949 master-0 kubenswrapper[7728]: I0223 14:19:28.670157 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.672569 master-0 kubenswrapper[7728]: I0223 14:19:28.672546 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-dnwls" Feb 23 14:19:28.672909 master-0 kubenswrapper[7728]: I0223 14:19:28.672896 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Feb 23 14:19:28.673110 master-0 kubenswrapper[7728]: I0223 14:19:28.673099 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Feb 23 14:19:28.674707 master-0 kubenswrapper[7728]: I0223 14:19:28.674691 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"] Feb 23 14:19:28.675625 master-0 kubenswrapper[7728]: I0223 14:19:28.675611 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.678242 master-0 kubenswrapper[7728]: I0223 14:19:28.678217 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-vvbrf" Feb 23 14:19:28.678438 master-0 kubenswrapper[7728]: I0223 14:19:28.678377 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Feb 23 14:19:28.678515 master-0 kubenswrapper[7728]: I0223 14:19:28.678436 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Feb 23 14:19:28.678645 master-0 kubenswrapper[7728]: I0223 14:19:28.678231 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Feb 23 14:19:28.678645 master-0 kubenswrapper[7728]: I0223 14:19:28.678255 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Feb 23 14:19:28.685329 master-0 kubenswrapper[7728]: I0223 14:19:28.685264 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp"] Feb 23 14:19:28.694854 master-0 kubenswrapper[7728]: I0223 14:19:28.694810 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rg9g\" (UniqueName: \"kubernetes.io/projected/12b256b7-a57b-4124-8452-25e74cfa7926-kube-api-access-2rg9g\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.694984 master-0 kubenswrapper[7728]: I0223 14:19:28.694878 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-images\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.694984 master-0 kubenswrapper[7728]: I0223 14:19:28.694908 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.694984 master-0 kubenswrapper[7728]: I0223 14:19:28.694932 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.694984 master-0 kubenswrapper[7728]: I0223 14:19:28.694967 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.695101 master-0 kubenswrapper[7728]: I0223 14:19:28.694986 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: 
\"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.695101 master-0 kubenswrapper[7728]: I0223 14:19:28.695013 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.695101 master-0 kubenswrapper[7728]: I0223 14:19:28.695039 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtbcj\" (UniqueName: \"kubernetes.io/projected/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-kube-api-access-qtbcj\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.697232 master-0 kubenswrapper[7728]: I0223 14:19:28.696192 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"] Feb 23 14:19:28.795696 master-0 kubenswrapper[7728]: I0223 14:19:28.795610 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbcj\" (UniqueName: \"kubernetes.io/projected/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-kube-api-access-qtbcj\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.795914 master-0 kubenswrapper[7728]: I0223 14:19:28.795821 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rg9g\" (UniqueName: \"kubernetes.io/projected/12b256b7-a57b-4124-8452-25e74cfa7926-kube-api-access-2rg9g\") pod 
\"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.795914 master-0 kubenswrapper[7728]: I0223 14:19:28.795863 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-images\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.795914 master-0 kubenswrapper[7728]: I0223 14:19:28.795895 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.796243 master-0 kubenswrapper[7728]: I0223 14:19:28.795923 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.796243 master-0 kubenswrapper[7728]: I0223 14:19:28.796185 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.796243 master-0 kubenswrapper[7728]: I0223 
14:19:28.796228 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.796390 master-0 kubenswrapper[7728]: I0223 14:19:28.796298 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.797158 master-0 kubenswrapper[7728]: I0223 14:19:28.797115 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.797236 master-0 kubenswrapper[7728]: I0223 14:19:28.797181 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-images\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.797337 master-0 kubenswrapper[7728]: I0223 14:19:28.797286 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " 
pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.799597 master-0 kubenswrapper[7728]: I0223 14:19:28.799567 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.800038 master-0 kubenswrapper[7728]: I0223 14:19:28.799996 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.811533 master-0 kubenswrapper[7728]: I0223 14:19:28.811504 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.813793 master-0 kubenswrapper[7728]: I0223 14:19:28.813750 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rg9g\" (UniqueName: \"kubernetes.io/projected/12b256b7-a57b-4124-8452-25e74cfa7926-kube-api-access-2rg9g\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:28.815076 master-0 kubenswrapper[7728]: I0223 14:19:28.815038 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbcj\" (UniqueName: 
\"kubernetes.io/projected/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-kube-api-access-qtbcj\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:28.903458 master-0 kubenswrapper[7728]: I0223 14:19:28.903344 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" event={"ID":"ad0f0d72-0337-4347-bb50-e299a175f3ca","Type":"ContainerStarted","Data":"3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a"} Feb 23 14:19:28.906365 master-0 kubenswrapper[7728]: I0223 14:19:28.906302 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" event={"ID":"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4","Type":"ContainerStarted","Data":"d9d7c0c01b5302e99d057b82e04bc20f6aaa2ecd35612cea6195a17dbb1d878e"} Feb 23 14:19:29.021474 master-0 kubenswrapper[7728]: I0223 14:19:29.021362 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:19:29.037801 master-0 kubenswrapper[7728]: I0223 14:19:29.037746 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:19:29.232870 master-0 kubenswrapper[7728]: I0223 14:19:29.232550 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5483bcd0-a9c7-4fdf-9c55-03f85a06b303" path="/var/lib/kubelet/pods/5483bcd0-a9c7-4fdf-9c55-03f85a06b303/volumes" Feb 23 14:19:30.016136 master-0 kubenswrapper[7728]: I0223 14:19:30.015957 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"] Feb 23 14:19:30.019614 master-0 kubenswrapper[7728]: I0223 14:19:30.018430 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp"] Feb 23 14:19:30.028001 master-0 kubenswrapper[7728]: I0223 14:19:30.026841 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Feb 23 14:19:30.028001 master-0 kubenswrapper[7728]: I0223 14:19:30.027104 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="d1bffce5-019a-4c97-85f2-929dc19a0bde" containerName="installer" containerID="cri-o://e8935c6e444aa0e24024f5ff856a9b30868587f159c8c2351155b9dff7539917" gracePeriod=30 Feb 23 14:19:30.052986 master-0 kubenswrapper[7728]: I0223 14:19:30.052925 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"] Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.053736 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.057045 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"] Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.057931 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.065226 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-zc2l5" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.065759 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.065966 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.066442 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.066641 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.066784 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 14:19:30.095202 master-0 kubenswrapper[7728]: I0223 14:19:30.066938 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-r2snt" Feb 23 14:19:30.096937 master-0 
kubenswrapper[7728]: W0223 14:19:30.096893 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b256b7_a57b_4124_8452_25e74cfa7926.slice/crio-14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640 WatchSource:0}: Error finding container 14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640: Status 404 returned error can't find the container with id 14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640 Feb 23 14:19:30.127509 master-0 kubenswrapper[7728]: I0223 14:19:30.124576 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn"] Feb 23 14:19:30.127509 master-0 kubenswrapper[7728]: I0223 14:19:30.125226 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.127509 master-0 kubenswrapper[7728]: I0223 14:19:30.126675 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 23 14:19:30.130527 master-0 kubenswrapper[7728]: I0223 14:19:30.128389 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"] Feb 23 14:19:30.151502 master-0 kubenswrapper[7728]: I0223 14:19:30.138560 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"] Feb 23 14:19:30.151502 master-0 kubenswrapper[7728]: I0223 14:19:30.145430 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn"] Feb 23 14:19:30.177611 master-0 kubenswrapper[7728]: I0223 14:19:30.177490 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v"] Feb 23 
14:19:30.178268 master-0 kubenswrapper[7728]: I0223 14:19:30.178234 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:19:30.178990 master-0 kubenswrapper[7728]: I0223 14:19:30.178924 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-59b498fcfb-rz897"] Feb 23 14:19:30.181099 master-0 kubenswrapper[7728]: I0223 14:19:30.179502 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.194549 master-0 kubenswrapper[7728]: I0223 14:19:30.194151 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-cpnmg" Feb 23 14:19:30.194549 master-0 kubenswrapper[7728]: I0223 14:19:30.194538 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Feb 23 14:19:30.199993 master-0 kubenswrapper[7728]: I0223 14:19:30.195229 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Feb 23 14:19:30.199993 master-0 kubenswrapper[7728]: I0223 14:19:30.195505 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Feb 23 14:19:30.199993 master-0 kubenswrapper[7728]: I0223 14:19:30.195741 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Feb 23 14:19:30.202005 master-0 kubenswrapper[7728]: I0223 14:19:30.201969 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Feb 23 14:19:30.202205 master-0 kubenswrapper[7728]: I0223 14:19:30.202178 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Feb 23 
14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.211950 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212004 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nn4m\" (UniqueName: \"kubernetes.io/projected/255b5a89-1b89-42dc-868a-32ce67975a54-kube-api-access-5nn4m\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212043 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212060 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212079 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-9tl7p\" (UniqueName: \"kubernetes.io/projected/92c63c95-e880-4f51-9858-7715343f7bd8-kube-api-access-9tl7p\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212123 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212149 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44599\" (UniqueName: \"kubernetes.io/projected/0cebb80d-d898-44c8-82b3-1e18833cee3f-kube-api-access-44599\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212186 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92c63c95-e880-4f51-9858-7715343f7bd8-available-featuregates\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.212773 master-0 kubenswrapper[7728]: I0223 14:19:30.212206 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert\") pod 
\"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.219912 master-0 kubenswrapper[7728]: I0223 14:19:30.216556 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v"] Feb 23 14:19:30.219912 master-0 kubenswrapper[7728]: I0223 14:19:30.219036 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-59b498fcfb-rz897"] Feb 23 14:19:30.313227 master-0 kubenswrapper[7728]: I0223 14:19:30.313010 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:19:30.313227 master-0 kubenswrapper[7728]: I0223 14:19:30.313083 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.313227 master-0 kubenswrapper[7728]: I0223 14:19:30.313126 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl87q\" (UniqueName: \"kubernetes.io/projected/fbb66172-1ea9-4683-b88f-227c4fd94924-kube-api-access-kl87q\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 
23 14:19:30.313227 master-0 kubenswrapper[7728]: I0223 14:19:30.313150 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzj2j\" (UniqueName: \"kubernetes.io/projected/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-kube-api-access-lzj2j\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.313227 master-0 kubenswrapper[7728]: I0223 14:19:30.313177 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44599\" (UniqueName: \"kubernetes.io/projected/0cebb80d-d898-44c8-82b3-1e18833cee3f-kube-api-access-44599\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.313227 master-0 kubenswrapper[7728]: I0223 14:19:30.313205 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.313227 master-0 kubenswrapper[7728]: I0223 14:19:30.313231 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92c63c95-e880-4f51-9858-7715343f7bd8-available-featuregates\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.313533 master-0 kubenswrapper[7728]: I0223 14:19:30.313254 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.313533 master-0 kubenswrapper[7728]: I0223 14:19:30.313281 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.313533 master-0 kubenswrapper[7728]: I0223 14:19:30.313304 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn4m\" (UniqueName: \"kubernetes.io/projected/255b5a89-1b89-42dc-868a-32ce67975a54-kube-api-access-5nn4m\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.313533 master-0 kubenswrapper[7728]: I0223 14:19:30.313333 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.313533 master-0 kubenswrapper[7728]: I0223 14:19:30.313361 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.313533 master-0 kubenswrapper[7728]: I0223 14:19:30.313391 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.315247 master-0 kubenswrapper[7728]: I0223 14:19:30.314872 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92c63c95-e880-4f51-9858-7715343f7bd8-available-featuregates\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.315247 master-0 kubenswrapper[7728]: I0223 14:19:30.315030 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tl7p\" (UniqueName: \"kubernetes.io/projected/92c63c95-e880-4f51-9858-7715343f7bd8-kube-api-access-9tl7p\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.315247 master-0 kubenswrapper[7728]: I0223 14:19:30.315071 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.315247 master-0 kubenswrapper[7728]: I0223 14:19:30.315099 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-snapshots\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.324341 master-0 kubenswrapper[7728]: I0223 14:19:30.318155 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.324341 master-0 kubenswrapper[7728]: I0223 14:19:30.318716 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.324341 master-0 kubenswrapper[7728]: I0223 14:19:30.319933 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.332136 master-0 kubenswrapper[7728]: I0223 14:19:30.332100 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.338865 master-0 kubenswrapper[7728]: I0223 14:19:30.338825 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.348569 master-0 kubenswrapper[7728]: I0223 14:19:30.348424 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tl7p\" (UniqueName: \"kubernetes.io/projected/92c63c95-e880-4f51-9858-7715343f7bd8-kube-api-access-9tl7p\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.352568 master-0 kubenswrapper[7728]: I0223 14:19:30.352530 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn4m\" (UniqueName: \"kubernetes.io/projected/255b5a89-1b89-42dc-868a-32ce67975a54-kube-api-access-5nn4m\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.355252 master-0 kubenswrapper[7728]: I0223 14:19:30.355221 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44599\" (UniqueName: \"kubernetes.io/projected/0cebb80d-d898-44c8-82b3-1e18833cee3f-kube-api-access-44599\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.416336 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-snapshots\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.416379 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.416407 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.416440 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl87q\" (UniqueName: \"kubernetes.io/projected/fbb66172-1ea9-4683-b88f-227c4fd94924-kube-api-access-kl87q\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.416459 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzj2j\" (UniqueName: \"kubernetes.io/projected/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-kube-api-access-lzj2j\") pod \"insights-operator-59b498fcfb-rz897\" (UID: 
\"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.416494 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.416522 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.419137 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.420497 master-0 kubenswrapper[7728]: I0223 14:19:30.419524 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-snapshots\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.420881 master-0 kubenswrapper[7728]: I0223 14:19:30.420602 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.421521 master-0 kubenswrapper[7728]: I0223 14:19:30.421377 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.425497 master-0 kubenswrapper[7728]: I0223 14:19:30.423750 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:19:30.441286 master-0 kubenswrapper[7728]: I0223 14:19:30.441237 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:30.441723 master-0 kubenswrapper[7728]: I0223 14:19:30.441696 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl87q\" (UniqueName: \"kubernetes.io/projected/fbb66172-1ea9-4683-b88f-227c4fd94924-kube-api-access-kl87q\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:19:30.449505 master-0 kubenswrapper[7728]: I0223 14:19:30.449089 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzj2j\" (UniqueName: \"kubernetes.io/projected/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-kube-api-access-lzj2j\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.488803 master-0 kubenswrapper[7728]: I0223 14:19:30.484282 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:30.497157 master-0 kubenswrapper[7728]: I0223 14:19:30.496713 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr"] Feb 23 14:19:30.498972 master-0 kubenswrapper[7728]: I0223 14:19:30.498951 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.500082 master-0 kubenswrapper[7728]: I0223 14:19:30.500050 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr"] Feb 23 14:19:30.503248 master-0 kubenswrapper[7728]: I0223 14:19:30.502605 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 14:19:30.503248 master-0 kubenswrapper[7728]: I0223 14:19:30.502916 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 14:19:30.503248 master-0 kubenswrapper[7728]: I0223 14:19:30.502928 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-zmwkf" Feb 23 14:19:30.503248 master-0 kubenswrapper[7728]: I0223 14:19:30.503012 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 14:19:30.503248 master-0 kubenswrapper[7728]: I0223 14:19:30.503091 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 14:19:30.504347 master-0 kubenswrapper[7728]: I0223 14:19:30.504237 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 23 14:19:30.512501 master-0 kubenswrapper[7728]: I0223 14:19:30.512446 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:30.590832 master-0 kubenswrapper[7728]: I0223 14:19:30.580946 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Feb 23 14:19:30.590832 master-0 kubenswrapper[7728]: I0223 14:19:30.581147 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" containerName="installer" containerID="cri-o://0e1388d50e8407b49b200cd8d22318640f8b3d9ec7fffb9a2e4a3714e4725182" gracePeriod=30 Feb 23 14:19:30.625586 master-0 kubenswrapper[7728]: I0223 14:19:30.615195 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:19:30.626583 master-0 kubenswrapper[7728]: I0223 14:19:30.626391 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.626583 master-0 kubenswrapper[7728]: I0223 14:19:30.626425 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxrvf\" (UniqueName: \"kubernetes.io/projected/af950a67-1557-4352-8100-27281bb8ecbe-kube-api-access-jxrvf\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.626583 master-0 kubenswrapper[7728]: I0223 14:19:30.626449 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.626583 master-0 kubenswrapper[7728]: I0223 14:19:30.626493 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.659760 master-0 kubenswrapper[7728]: I0223 14:19:30.659650 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:19:30.728335 master-0 kubenswrapper[7728]: I0223 14:19:30.727760 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.728335 master-0 kubenswrapper[7728]: I0223 14:19:30.727800 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrvf\" (UniqueName: \"kubernetes.io/projected/af950a67-1557-4352-8100-27281bb8ecbe-kube-api-access-jxrvf\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.728335 master-0 kubenswrapper[7728]: I0223 
14:19:30.727822 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.728335 master-0 kubenswrapper[7728]: I0223 14:19:30.727857 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.729336 master-0 kubenswrapper[7728]: I0223 14:19:30.729318 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.729834 master-0 kubenswrapper[7728]: I0223 14:19:30.729795 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.730638 master-0 kubenswrapper[7728]: I0223 14:19:30.730595 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: 
\"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.745327 master-0 kubenswrapper[7728]: I0223 14:19:30.745249 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrvf\" (UniqueName: \"kubernetes.io/projected/af950a67-1557-4352-8100-27281bb8ecbe-kube-api-access-jxrvf\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.823931 master-0 kubenswrapper[7728]: I0223 14:19:30.823570 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:19:30.921375 master-0 kubenswrapper[7728]: I0223 14:19:30.921321 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640"} Feb 23 14:19:30.923208 master-0 kubenswrapper[7728]: I0223 14:19:30.923080 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_5a5f3286-9ec6-4867-aaf7-3c31b1f6c126/installer/0.log" Feb 23 14:19:30.923208 master-0 kubenswrapper[7728]: I0223 14:19:30.923126 7728 generic.go:334] "Generic (PLEG): container finished" podID="5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" containerID="0e1388d50e8407b49b200cd8d22318640f8b3d9ec7fffb9a2e4a3714e4725182" exitCode=1 Feb 23 14:19:30.923208 master-0 kubenswrapper[7728]: I0223 14:19:30.923183 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" 
event={"ID":"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126","Type":"ContainerDied","Data":"0e1388d50e8407b49b200cd8d22318640f8b3d9ec7fffb9a2e4a3714e4725182"} Feb 23 14:19:30.925972 master-0 kubenswrapper[7728]: I0223 14:19:30.925918 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerStarted","Data":"20ad7c69969be4fe0faec7c107f4d93134a311b1be86286c2c8404730a5b81aa"} Feb 23 14:19:30.925972 master-0 kubenswrapper[7728]: I0223 14:19:30.925967 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerStarted","Data":"2a52d8e1940b8a601f24fbfede361672aeb32a5195856b935f21043b52b85ae5"} Feb 23 14:19:30.964876 master-0 kubenswrapper[7728]: I0223 14:19:30.964796 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"] Feb 23 14:19:31.011269 master-0 kubenswrapper[7728]: I0223 14:19:31.011217 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn"] Feb 23 14:19:31.027431 master-0 kubenswrapper[7728]: I0223 14:19:31.025955 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb"] Feb 23 14:19:31.027431 master-0 kubenswrapper[7728]: I0223 14:19:31.026777 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.029860 master-0 kubenswrapper[7728]: I0223 14:19:31.029675 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-bqgj7" Feb 23 14:19:31.030900 master-0 kubenswrapper[7728]: I0223 14:19:31.030519 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:19:31.030900 master-0 kubenswrapper[7728]: I0223 14:19:31.030725 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 14:19:31.030900 master-0 kubenswrapper[7728]: I0223 14:19:31.030720 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Feb 23 14:19:31.030900 master-0 kubenswrapper[7728]: I0223 14:19:31.030833 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Feb 23 14:19:31.031047 master-0 kubenswrapper[7728]: I0223 14:19:31.031025 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Feb 23 14:19:31.123843 master-0 kubenswrapper[7728]: I0223 14:19:31.123777 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"] Feb 23 14:19:31.135872 master-0 kubenswrapper[7728]: I0223 14:19:31.131426 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvp2k\" (UniqueName: \"kubernetes.io/projected/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-kube-api-access-vvp2k\") pod 
\"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.135872 master-0 kubenswrapper[7728]: I0223 14:19:31.131465 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.135872 master-0 kubenswrapper[7728]: I0223 14:19:31.131516 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-images\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.135872 master-0 kubenswrapper[7728]: I0223 14:19:31.131540 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.135872 master-0 kubenswrapper[7728]: I0223 14:19:31.131596 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.237548 master-0 kubenswrapper[7728]: I0223 14:19:31.236258 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.237548 master-0 kubenswrapper[7728]: I0223 14:19:31.236332 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvp2k\" (UniqueName: \"kubernetes.io/projected/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-kube-api-access-vvp2k\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.237548 master-0 kubenswrapper[7728]: I0223 14:19:31.236368 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.237548 master-0 kubenswrapper[7728]: I0223 14:19:31.236398 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-images\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.237548 master-0 kubenswrapper[7728]: I0223 14:19:31.236418 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.237548 master-0 kubenswrapper[7728]: I0223 14:19:31.237372 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-images\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.237819 master-0 kubenswrapper[7728]: I0223 14:19:31.237789 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.239876 master-0 kubenswrapper[7728]: I0223 14:19:31.236429 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.240122 master-0 kubenswrapper[7728]: I0223 14:19:31.240023 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.249602 master-0 kubenswrapper[7728]: I0223 14:19:31.247130 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_5a5f3286-9ec6-4867-aaf7-3c31b1f6c126/installer/0.log" Feb 23 14:19:31.249602 master-0 kubenswrapper[7728]: I0223 14:19:31.247206 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Feb 23 14:19:31.261377 master-0 kubenswrapper[7728]: I0223 14:19:31.257931 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-789c749c66-w826g"] Feb 23 14:19:31.262776 master-0 kubenswrapper[7728]: I0223 14:19:31.262725 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" podUID="b2af015a-7f5f-4702-9f06-8159a040dabe" containerName="controller-manager" containerID="cri-o://ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f" gracePeriod=30 Feb 23 14:19:31.272595 master-0 kubenswrapper[7728]: I0223 14:19:31.272548 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvp2k\" (UniqueName: \"kubernetes.io/projected/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-kube-api-access-vvp2k\") pod \"cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.286105 master-0 kubenswrapper[7728]: I0223 14:19:31.275562 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v"] Feb 23 14:19:31.286105 master-0 kubenswrapper[7728]: I0223 14:19:31.279553 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z"] Feb 23 14:19:31.286105 master-0 kubenswrapper[7728]: I0223 14:19:31.279844 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" podUID="9694f604-5dcf-4aec-a54f-67b4d8dc8809" containerName="route-controller-manager" 
containerID="cri-o://f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0" gracePeriod=30 Feb 23 14:19:31.287167 master-0 kubenswrapper[7728]: W0223 14:19:31.287024 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb66172_1ea9_4683_b88f_227c4fd94924.slice/crio-16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36 WatchSource:0}: Error finding container 16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36: Status 404 returned error can't find the container with id 16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36 Feb 23 14:19:31.319904 master-0 kubenswrapper[7728]: I0223 14:19:31.319849 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-59b498fcfb-rz897"] Feb 23 14:19:31.328408 master-0 kubenswrapper[7728]: I0223 14:19:31.327908 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr"] Feb 23 14:19:31.346283 master-0 kubenswrapper[7728]: I0223 14:19:31.341798 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kubelet-dir\") pod \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " Feb 23 14:19:31.346283 master-0 kubenswrapper[7728]: I0223 14:19:31.341885 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kube-api-access\") pod \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " Feb 23 14:19:31.346283 master-0 kubenswrapper[7728]: I0223 14:19:31.341925 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-var-lock\") pod \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\" (UID: \"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126\") " Feb 23 14:19:31.346283 master-0 kubenswrapper[7728]: I0223 14:19:31.342177 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-var-lock" (OuterVolumeSpecName: "var-lock") pod "5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" (UID: "5a5f3286-9ec6-4867-aaf7-3c31b1f6c126"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:31.346283 master-0 kubenswrapper[7728]: I0223 14:19:31.342208 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" (UID: "5a5f3286-9ec6-4867-aaf7-3c31b1f6c126"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:31.351457 master-0 kubenswrapper[7728]: I0223 14:19:31.349508 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" (UID: "5a5f3286-9ec6-4867-aaf7-3c31b1f6c126"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:31.443950 master-0 kubenswrapper[7728]: I0223 14:19:31.443866 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:31.443950 master-0 kubenswrapper[7728]: I0223 14:19:31.443899 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:31.443950 master-0 kubenswrapper[7728]: I0223 14:19:31.443908 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:31.451807 master-0 kubenswrapper[7728]: I0223 14:19:31.449860 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" Feb 23 14:19:31.654225 master-0 kubenswrapper[7728]: I0223 14:19:31.654119 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl"] Feb 23 14:19:31.654532 master-0 kubenswrapper[7728]: E0223 14:19:31.654503 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" containerName="installer" Feb 23 14:19:31.654532 master-0 kubenswrapper[7728]: I0223 14:19:31.654528 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" containerName="installer" Feb 23 14:19:31.654639 master-0 kubenswrapper[7728]: I0223 14:19:31.654623 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" containerName="installer" Feb 23 14:19:31.655273 master-0 kubenswrapper[7728]: I0223 14:19:31.655239 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.657850 master-0 kubenswrapper[7728]: I0223 14:19:31.657815 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 23 14:19:31.657988 master-0 kubenswrapper[7728]: I0223 14:19:31.657951 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 23 14:19:31.658027 master-0 kubenswrapper[7728]: I0223 14:19:31.658008 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 14:19:31.658177 master-0 kubenswrapper[7728]: I0223 14:19:31.658149 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-5wrwc" Feb 23 14:19:31.686541 master-0 kubenswrapper[7728]: I0223 14:19:31.683305 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl"] Feb 23 14:19:31.746616 master-0 kubenswrapper[7728]: I0223 14:19:31.746553 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.746616 master-0 kubenswrapper[7728]: I0223 14:19:31.746622 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72769\" (UniqueName: \"kubernetes.io/projected/ceba7b56-f910-473d-aed5-add94868fb31-kube-api-access-72769\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.747270 
master-0 kubenswrapper[7728]: I0223 14:19:31.746653 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.747270 master-0 kubenswrapper[7728]: I0223 14:19:31.746674 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.802439 master-0 kubenswrapper[7728]: I0223 14:19:31.802410 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:31.845929 master-0 kubenswrapper[7728]: I0223 14:19:31.845899 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:31.847800 master-0 kubenswrapper[7728]: I0223 14:19:31.847752 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.847888 master-0 kubenswrapper[7728]: I0223 14:19:31.847839 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72769\" (UniqueName: \"kubernetes.io/projected/ceba7b56-f910-473d-aed5-add94868fb31-kube-api-access-72769\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.847935 master-0 kubenswrapper[7728]: I0223 14:19:31.847918 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.848055 master-0 kubenswrapper[7728]: I0223 14:19:31.847982 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.848886 master-0 kubenswrapper[7728]: I0223 14:19:31.848841 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.849220 master-0 kubenswrapper[7728]: I0223 14:19:31.849193 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.851443 master-0 kubenswrapper[7728]: I0223 14:19:31.851271 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.934181 master-0 kubenswrapper[7728]: I0223 14:19:31.934120 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72769\" (UniqueName: \"kubernetes.io/projected/ceba7b56-f910-473d-aed5-add94868fb31-kube-api-access-72769\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:31.943587 master-0 kubenswrapper[7728]: I0223 14:19:31.943519 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" event={"ID":"0cebb80d-d898-44c8-82b3-1e18833cee3f","Type":"ContainerStarted","Data":"9c93c157cec045045a20555d6c266b19e6eda783508c1ee10612534c934a9d8f"} Feb 23 14:19:31.943587 master-0 kubenswrapper[7728]: I0223 14:19:31.943565 7728 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" event={"ID":"0cebb80d-d898-44c8-82b3-1e18833cee3f","Type":"ContainerStarted","Data":"5b1e3102064a2333a33694b441523235b1896dd8c0ad7164b8c2f46c1cc4e9c2"} Feb 23 14:19:31.945006 master-0 kubenswrapper[7728]: I0223 14:19:31.944973 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:31.948373 master-0 kubenswrapper[7728]: I0223 14:19:31.948219 7728 generic.go:334] "Generic (PLEG): container finished" podID="9694f604-5dcf-4aec-a54f-67b4d8dc8809" containerID="f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0" exitCode=0 Feb 23 14:19:31.948373 master-0 kubenswrapper[7728]: I0223 14:19:31.948265 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" event={"ID":"9694f604-5dcf-4aec-a54f-67b4d8dc8809","Type":"ContainerDied","Data":"f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0"} Feb 23 14:19:31.948373 master-0 kubenswrapper[7728]: I0223 14:19:31.948290 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" event={"ID":"9694f604-5dcf-4aec-a54f-67b4d8dc8809","Type":"ContainerDied","Data":"d787bbe507abbf8e0df335f2db362ff8804d2d269abbde4b9813202bd59071f6"} Feb 23 14:19:31.948373 master-0 kubenswrapper[7728]: I0223 14:19:31.948308 7728 scope.go:117] "RemoveContainer" containerID="f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0" Feb 23 14:19:31.948733 master-0 kubenswrapper[7728]: I0223 14:19:31.948467 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z" Feb 23 14:19:31.949044 master-0 kubenswrapper[7728]: I0223 14:19:31.948953 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:19:31.949044 master-0 kubenswrapper[7728]: I0223 14:19:31.948964 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2af015a-7f5f-4702-9f06-8159a040dabe-serving-cert\") pod \"b2af015a-7f5f-4702-9f06-8159a040dabe\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.952734 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpzm8\" (UniqueName: \"kubernetes.io/projected/b2af015a-7f5f-4702-9f06-8159a040dabe-kube-api-access-rpzm8\") pod \"b2af015a-7f5f-4702-9f06-8159a040dabe\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.952841 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-client-ca\") pod \"b2af015a-7f5f-4702-9f06-8159a040dabe\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.952895 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-config\") pod \"b2af015a-7f5f-4702-9f06-8159a040dabe\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.952953 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-proxy-ca-bundles\") pod \"b2af015a-7f5f-4702-9f06-8159a040dabe\" (UID: \"b2af015a-7f5f-4702-9f06-8159a040dabe\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.953023 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5f7c\" (UniqueName: \"kubernetes.io/projected/9694f604-5dcf-4aec-a54f-67b4d8dc8809-kube-api-access-n5f7c\") pod \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.953060 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9694f604-5dcf-4aec-a54f-67b4d8dc8809-serving-cert\") pod \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.953110 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-client-ca\") pod \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.953182 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-config\") pod \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\" (UID: \"9694f604-5dcf-4aec-a54f-67b4d8dc8809\") " Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.953770 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-client-ca" (OuterVolumeSpecName: "client-ca") pod "b2af015a-7f5f-4702-9f06-8159a040dabe" (UID: "b2af015a-7f5f-4702-9f06-8159a040dabe"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.954613 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-config" (OuterVolumeSpecName: "config") pod "b2af015a-7f5f-4702-9f06-8159a040dabe" (UID: "b2af015a-7f5f-4702-9f06-8159a040dabe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.954632 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-client-ca" (OuterVolumeSpecName: "client-ca") pod "9694f604-5dcf-4aec-a54f-67b4d8dc8809" (UID: "9694f604-5dcf-4aec-a54f-67b4d8dc8809"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.955159 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-config" (OuterVolumeSpecName: "config") pod "9694f604-5dcf-4aec-a54f-67b4d8dc8809" (UID: "9694f604-5dcf-4aec-a54f-67b4d8dc8809"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.957378 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_5a5f3286-9ec6-4867-aaf7-3c31b1f6c126/installer/0.log" Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.957591 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.957594 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"5a5f3286-9ec6-4867-aaf7-3c31b1f6c126","Type":"ContainerDied","Data":"4c3108828f0ca8b6e625b4baa7448d873f784369cabaee373f77769b11f1f00a"} Feb 23 14:19:31.960747 master-0 kubenswrapper[7728]: I0223 14:19:31.958106 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2af015a-7f5f-4702-9f06-8159a040dabe-kube-api-access-rpzm8" (OuterVolumeSpecName: "kube-api-access-rpzm8") pod "b2af015a-7f5f-4702-9f06-8159a040dabe" (UID: "b2af015a-7f5f-4702-9f06-8159a040dabe"). InnerVolumeSpecName "kube-api-access-rpzm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:31.961689 master-0 kubenswrapper[7728]: I0223 14:19:31.960810 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b2af015a-7f5f-4702-9f06-8159a040dabe" (UID: "b2af015a-7f5f-4702-9f06-8159a040dabe"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:31.961689 master-0 kubenswrapper[7728]: I0223 14:19:31.961389 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2af015a-7f5f-4702-9f06-8159a040dabe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b2af015a-7f5f-4702-9f06-8159a040dabe" (UID: "b2af015a-7f5f-4702-9f06-8159a040dabe"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:31.961689 master-0 kubenswrapper[7728]: I0223 14:19:31.961455 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" event={"ID":"fbb66172-1ea9-4683-b88f-227c4fd94924","Type":"ContainerStarted","Data":"16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36"} Feb 23 14:19:31.961689 master-0 kubenswrapper[7728]: I0223 14:19:31.961601 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9694f604-5dcf-4aec-a54f-67b4d8dc8809-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9694f604-5dcf-4aec-a54f-67b4d8dc8809" (UID: "9694f604-5dcf-4aec-a54f-67b4d8dc8809"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:31.963783 master-0 kubenswrapper[7728]: I0223 14:19:31.963255 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9694f604-5dcf-4aec-a54f-67b4d8dc8809-kube-api-access-n5f7c" (OuterVolumeSpecName: "kube-api-access-n5f7c") pod "9694f604-5dcf-4aec-a54f-67b4d8dc8809" (UID: "9694f604-5dcf-4aec-a54f-67b4d8dc8809"). InnerVolumeSpecName "kube-api-access-n5f7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:31.969388 master-0 kubenswrapper[7728]: I0223 14:19:31.969341 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"5c45c6ef7e4b05f37927f974cf6cb4b129b6dd3a04cd82a3e97ac1e29ceb5010"} Feb 23 14:19:31.972459 master-0 kubenswrapper[7728]: I0223 14:19:31.972413 7728 generic.go:334] "Generic (PLEG): container finished" podID="b2af015a-7f5f-4702-9f06-8159a040dabe" containerID="ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f" exitCode=0 Feb 23 14:19:31.972550 master-0 kubenswrapper[7728]: I0223 14:19:31.972524 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" event={"ID":"b2af015a-7f5f-4702-9f06-8159a040dabe","Type":"ContainerDied","Data":"ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f"} Feb 23 14:19:31.972615 master-0 kubenswrapper[7728]: I0223 14:19:31.972554 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" event={"ID":"b2af015a-7f5f-4702-9f06-8159a040dabe","Type":"ContainerDied","Data":"e51d035a0dfb539fd05e098e8e355aca5284eb3bb12234934a07a3e19cdf8677"} Feb 23 14:19:31.972675 master-0 kubenswrapper[7728]: I0223 14:19:31.972642 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-789c749c66-w826g" Feb 23 14:19:31.978688 master-0 kubenswrapper[7728]: I0223 14:19:31.978259 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerStarted","Data":"3d6da8a2ab007c14781f7a758e38f4dc17838974a913b095ba4be079439082e2"} Feb 23 14:19:31.979217 master-0 kubenswrapper[7728]: I0223 14:19:31.979125 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerStarted","Data":"7c2bb3b30fb024eb641e1765113a1a36bccda4c627461f72aa312212e851ddb2"} Feb 23 14:19:31.993063 master-0 kubenswrapper[7728]: I0223 14:19:31.988271 7728 scope.go:117] "RemoveContainer" containerID="f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0" Feb 23 14:19:31.993063 master-0 kubenswrapper[7728]: I0223 14:19:31.988404 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59b498fcfb-rz897" event={"ID":"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2","Type":"ContainerStarted","Data":"dac97420bb9e3883db6238fff45b69e331260668444b09a65159744c334f79d2"} Feb 23 14:19:31.993063 master-0 kubenswrapper[7728]: E0223 14:19:31.989201 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0\": container with ID starting with f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0 not found: ID does not exist" containerID="f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0" Feb 23 14:19:31.993063 master-0 kubenswrapper[7728]: I0223 14:19:31.989239 7728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0"} err="failed to get container status \"f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0\": rpc error: code = NotFound desc = could not find container \"f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0\": container with ID starting with f5ecbbc97f8363e1b32d81fb3d21757d9b3607a089392deba2dfe5ae2374b9b0 not found: ID does not exist" Feb 23 14:19:31.993063 master-0 kubenswrapper[7728]: I0223 14:19:31.989264 7728 scope.go:117] "RemoveContainer" containerID="0e1388d50e8407b49b200cd8d22318640f8b3d9ec7fffb9a2e4a3714e4725182" Feb 23 14:19:32.020280 master-0 kubenswrapper[7728]: I0223 14:19:32.020237 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerStarted","Data":"223c331c7c80307a3bc03e311dd668265166e3a40331282a045bd5ac8d6db865"} Feb 23 14:19:32.028125 master-0 kubenswrapper[7728]: I0223 14:19:32.022635 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" event={"ID":"255b5a89-1b89-42dc-868a-32ce67975a54","Type":"ContainerStarted","Data":"3b1c0720f422e4cf08280fe87e035038c55aa20e5d2dbdf7c6fac26671b08d29"} Feb 23 14:19:32.028125 master-0 kubenswrapper[7728]: I0223 14:19:32.022698 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" event={"ID":"255b5a89-1b89-42dc-868a-32ce67975a54","Type":"ContainerStarted","Data":"a5727697c2b4cf38a7045ae8edfe9cf2a413bc9b589c95f8630aa7de7ef3ba40"} Feb 23 14:19:32.028125 master-0 kubenswrapper[7728]: I0223 14:19:32.023605 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 
14:19:32.037320 master-0 kubenswrapper[7728]: I0223 14:19:32.036241 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" podStartSLOduration=2.036220028 podStartE2EDuration="2.036220028s" podCreationTimestamp="2026-02-23 14:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:32.035964973 +0000 UTC m=+64.998626269" watchObservedRunningTime="2026-02-23 14:19:32.036220028 +0000 UTC m=+64.998881324" Feb 23 14:19:32.048951 master-0 kubenswrapper[7728]: I0223 14:19:32.046391 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:19:32.051453 master-0 kubenswrapper[7728]: I0223 14:19:32.051198 7728 scope.go:117] "RemoveContainer" containerID="ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057644 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057671 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b2af015a-7f5f-4702-9f06-8159a040dabe-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057681 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpzm8\" (UniqueName: \"kubernetes.io/projected/b2af015a-7f5f-4702-9f06-8159a040dabe-kube-api-access-rpzm8\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057691 7728 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-client-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057699 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057707 7728 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b2af015a-7f5f-4702-9f06-8159a040dabe-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057716 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5f7c\" (UniqueName: \"kubernetes.io/projected/9694f604-5dcf-4aec-a54f-67b4d8dc8809-kube-api-access-n5f7c\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057724 7728 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9694f604-5dcf-4aec-a54f-67b4d8dc8809-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.057750 master-0 kubenswrapper[7728]: I0223 14:19:32.057735 7728 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9694f604-5dcf-4aec-a54f-67b4d8dc8809-client-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:32.072428 master-0 kubenswrapper[7728]: I0223 14:19:32.072342 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" podStartSLOduration=2.072310846 podStartE2EDuration="2.072310846s" podCreationTimestamp="2026-02-23 14:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 14:19:32.061025562 +0000 UTC m=+65.023686858" watchObservedRunningTime="2026-02-23 14:19:32.072310846 +0000 UTC m=+65.034972142" Feb 23 14:19:32.080387 master-0 kubenswrapper[7728]: I0223 14:19:32.079454 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-789c749c66-w826g"] Feb 23 14:19:32.088009 master-0 kubenswrapper[7728]: I0223 14:19:32.087397 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-789c749c66-w826g"] Feb 23 14:19:32.098936 master-0 kubenswrapper[7728]: I0223 14:19:32.098886 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Feb 23 14:19:32.099341 master-0 kubenswrapper[7728]: I0223 14:19:32.099299 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:19:32.106562 master-0 kubenswrapper[7728]: I0223 14:19:32.106517 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Feb 23 14:19:32.123760 master-0 kubenswrapper[7728]: I0223 14:19:32.123493 7728 scope.go:117] "RemoveContainer" containerID="ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f" Feb 23 14:19:32.124095 master-0 kubenswrapper[7728]: E0223 14:19:32.124051 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f\": container with ID starting with ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f not found: ID does not exist" containerID="ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f" Feb 23 14:19:32.124231 master-0 kubenswrapper[7728]: I0223 14:19:32.124110 7728 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f"} err="failed to get container status \"ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f\": rpc error: code = NotFound desc = could not find container \"ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f\": container with ID starting with ea6ce47f9f9923c200af89a0427525b1c65bce88c6467d37861598640e2b236f not found: ID does not exist" Feb 23 14:19:32.592854 master-0 kubenswrapper[7728]: I0223 14:19:32.592807 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z"] Feb 23 14:19:32.636245 master-0 kubenswrapper[7728]: I0223 14:19:32.636088 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"] Feb 23 14:19:32.636757 master-0 kubenswrapper[7728]: E0223 14:19:32.636409 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9694f604-5dcf-4aec-a54f-67b4d8dc8809" containerName="route-controller-manager" Feb 23 14:19:32.636757 master-0 kubenswrapper[7728]: I0223 14:19:32.636425 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="9694f604-5dcf-4aec-a54f-67b4d8dc8809" containerName="route-controller-manager" Feb 23 14:19:32.636757 master-0 kubenswrapper[7728]: E0223 14:19:32.636442 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2af015a-7f5f-4702-9f06-8159a040dabe" containerName="controller-manager" Feb 23 14:19:32.636757 master-0 kubenswrapper[7728]: I0223 14:19:32.636449 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2af015a-7f5f-4702-9f06-8159a040dabe" containerName="controller-manager" Feb 23 14:19:32.636757 master-0 kubenswrapper[7728]: I0223 14:19:32.636563 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2af015a-7f5f-4702-9f06-8159a040dabe" containerName="controller-manager" Feb 23 14:19:32.636757 master-0 
kubenswrapper[7728]: I0223 14:19:32.636575 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="9694f604-5dcf-4aec-a54f-67b4d8dc8809" containerName="route-controller-manager" Feb 23 14:19:32.637254 master-0 kubenswrapper[7728]: I0223 14:19:32.636932 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.642639 master-0 kubenswrapper[7728]: I0223 14:19:32.640056 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 14:19:32.642639 master-0 kubenswrapper[7728]: I0223 14:19:32.640296 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 14:19:32.642639 master-0 kubenswrapper[7728]: I0223 14:19:32.641736 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 14:19:32.642639 master-0 kubenswrapper[7728]: I0223 14:19:32.642592 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 14:19:32.644438 master-0 kubenswrapper[7728]: I0223 14:19:32.642894 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 14:19:32.647402 master-0 kubenswrapper[7728]: I0223 14:19:32.647330 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 14:19:32.668842 master-0 kubenswrapper[7728]: I0223 14:19:32.662501 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Feb 23 14:19:32.668842 master-0 kubenswrapper[7728]: I0223 14:19:32.663794 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.668842 master-0 kubenswrapper[7728]: I0223 14:19:32.667388 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-4hjzs" Feb 23 14:19:32.668842 master-0 kubenswrapper[7728]: I0223 14:19:32.667413 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl"] Feb 23 14:19:32.671832 master-0 kubenswrapper[7728]: I0223 14:19:32.671650 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"] Feb 23 14:19:32.678209 master-0 kubenswrapper[7728]: I0223 14:19:32.674272 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c445f5bd9-qtw5z"] Feb 23 14:19:32.686415 master-0 kubenswrapper[7728]: I0223 14:19:32.683444 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Feb 23 14:19:32.686415 master-0 kubenswrapper[7728]: I0223 14:19:32.683530 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vwtc6"] Feb 23 14:19:32.686415 master-0 kubenswrapper[7728]: I0223 14:19:32.684431 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.715887 master-0 kubenswrapper[7728]: I0223 14:19:32.715678 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwtc6"] Feb 23 14:19:32.761954 master-0 kubenswrapper[7728]: I0223 14:19:32.761895 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xxh6f"] Feb 23 14:19:32.762939 master-0 kubenswrapper[7728]: I0223 14:19:32.762919 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.766291 master-0 kubenswrapper[7728]: I0223 14:19:32.766218 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-catalog-content\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.766291 master-0 kubenswrapper[7728]: I0223 14:19:32.766267 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96vz5\" (UniqueName: \"kubernetes.io/projected/b993917a-bce8-4467-a09d-bfc923a90460-kube-api-access-96vz5\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.766408 master-0 kubenswrapper[7728]: I0223 14:19:32.766290 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.766408 master-0 kubenswrapper[7728]: I0223 14:19:32.766318 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/493a9ed3-6d64-489a-a68c-235b69a58782-kube-api-access\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.766408 master-0 kubenswrapper[7728]: I0223 14:19:32.766394 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.766543 master-0 kubenswrapper[7728]: I0223 14:19:32.766462 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.766832 master-0 kubenswrapper[7728]: I0223 14:19:32.766808 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.766947 master-0 kubenswrapper[7728]: I0223 14:19:32.766927 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.766998 master-0 kubenswrapper[7728]: I0223 14:19:32.766977 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42sml\" (UniqueName: \"kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " 
pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.767089 master-0 kubenswrapper[7728]: I0223 14:19:32.767064 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-utilities\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.767197 master-0 kubenswrapper[7728]: I0223 14:19:32.767179 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-var-lock\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.780593 master-0 kubenswrapper[7728]: I0223 14:19:32.780553 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxh6f"] Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868318 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868363 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 
14:19:32.868382 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sml\" (UniqueName: \"kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868407 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-utilities\") pod \"community-operators-xxh6f\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868424 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-utilities\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868454 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59xbh\" (UniqueName: \"kubernetes.io/projected/2a0bf4c4-8272-4f24-8e48-525d7a278b26-kube-api-access-59xbh\") pod \"community-operators-xxh6f\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868472 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-catalog-content\") pod \"community-operators-xxh6f\" (UID: 
\"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868507 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-var-lock\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868530 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-catalog-content\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868550 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96vz5\" (UniqueName: \"kubernetes.io/projected/b993917a-bce8-4467-a09d-bfc923a90460-kube-api-access-96vz5\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868567 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868583 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/493a9ed3-6d64-489a-a68c-235b69a58782-kube-api-access\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868604 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.868941 master-0 kubenswrapper[7728]: I0223 14:19:32.868620 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.869678 master-0 kubenswrapper[7728]: I0223 14:19:32.869646 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.871792 master-0 kubenswrapper[7728]: I0223 14:19:32.870188 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.872036 master-0 kubenswrapper[7728]: I0223 14:19:32.871995 7728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-var-lock\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.874178 master-0 kubenswrapper[7728]: I0223 14:19:32.872576 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-utilities\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.874178 master-0 kubenswrapper[7728]: I0223 14:19:32.872958 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-catalog-content\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:32.874178 master-0 kubenswrapper[7728]: I0223 14:19:32.873102 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.874178 master-0 kubenswrapper[7728]: I0223 14:19:32.873318 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.875213 master-0 kubenswrapper[7728]: I0223 14:19:32.875099 7728 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh"] Feb 23 14:19:32.877983 master-0 kubenswrapper[7728]: I0223 14:19:32.877797 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:32.885865 master-0 kubenswrapper[7728]: I0223 14:19:32.885835 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 23 14:19:32.894836 master-0 kubenswrapper[7728]: I0223 14:19:32.894794 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh"] Feb 23 14:19:32.895571 master-0 kubenswrapper[7728]: I0223 14:19:32.895543 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.898639 master-0 kubenswrapper[7728]: I0223 14:19:32.898572 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/493a9ed3-6d64-489a-a68c-235b69a58782-kube-api-access\") pod \"installer-2-master-0\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.899402 master-0 kubenswrapper[7728]: I0223 14:19:32.899372 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96vz5\" (UniqueName: \"kubernetes.io/projected/b993917a-bce8-4467-a09d-bfc923a90460-kube-api-access-96vz5\") pod \"certified-operators-vwtc6\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " pod="openshift-marketplace/certified-operators-vwtc6" 
Feb 23 14:19:32.915881 master-0 kubenswrapper[7728]: I0223 14:19:32.915844 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sml\" (UniqueName: \"kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.966485 master-0 kubenswrapper[7728]: I0223 14:19:32.966338 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:32.973147 master-0 kubenswrapper[7728]: I0223 14:19:32.973031 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-utilities\") pod \"community-operators-xxh6f\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.973147 master-0 kubenswrapper[7728]: I0223 14:19:32.973087 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:32.973147 master-0 kubenswrapper[7728]: I0223 14:19:32.973114 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:32.974450 master-0 
kubenswrapper[7728]: I0223 14:19:32.973151 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59xbh\" (UniqueName: \"kubernetes.io/projected/2a0bf4c4-8272-4f24-8e48-525d7a278b26-kube-api-access-59xbh\") pod \"community-operators-xxh6f\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.974450 master-0 kubenswrapper[7728]: I0223 14:19:32.973178 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-catalog-content\") pod \"community-operators-xxh6f\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.974450 master-0 kubenswrapper[7728]: I0223 14:19:32.973229 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0315476e-7140-4777-8061-9cead4c92024-tmpfs\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:32.974450 master-0 kubenswrapper[7728]: I0223 14:19:32.973261 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtp7w\" (UniqueName: \"kubernetes.io/projected/0315476e-7140-4777-8061-9cead4c92024-kube-api-access-jtp7w\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:32.974450 master-0 kubenswrapper[7728]: I0223 14:19:32.973791 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-utilities\") pod \"community-operators-xxh6f\" (UID: 
\"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.975277 master-0 kubenswrapper[7728]: I0223 14:19:32.974831 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-catalog-content\") pod \"community-operators-xxh6f\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:32.978721 master-0 kubenswrapper[7728]: I0223 14:19:32.978228 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Feb 23 14:19:32.979635 master-0 kubenswrapper[7728]: I0223 14:19:32.979608 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:32.981870 master-0 kubenswrapper[7728]: I0223 14:19:32.981835 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bhcxh" Feb 23 14:19:32.982284 master-0 kubenswrapper[7728]: I0223 14:19:32.982161 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Feb 23 14:19:32.985087 master-0 kubenswrapper[7728]: I0223 14:19:32.985000 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Feb 23 14:19:32.989269 master-0 kubenswrapper[7728]: I0223 14:19:32.988712 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:19:32.993750 master-0 kubenswrapper[7728]: I0223 14:19:32.993717 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59xbh\" (UniqueName: \"kubernetes.io/projected/2a0bf4c4-8272-4f24-8e48-525d7a278b26-kube-api-access-59xbh\") pod \"community-operators-xxh6f\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:33.020064 master-0 kubenswrapper[7728]: I0223 14:19:33.020024 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:19:33.036738 master-0 kubenswrapper[7728]: I0223 14:19:33.036657 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerStarted","Data":"6e8f1beca1435f1c0deddd5b1efa7b8ba2b2233374745801d4686a667cf0b8d6"} Feb 23 14:19:33.055406 master-0 kubenswrapper[7728]: I0223 14:19:33.055130 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" podStartSLOduration=3.05511317 podStartE2EDuration="3.05511317s" podCreationTimestamp="2026-02-23 14:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:33.053808392 +0000 UTC m=+66.016469688" watchObservedRunningTime="2026-02-23 14:19:33.05511317 +0000 UTC m=+66.017774466" Feb 23 14:19:33.074850 master-0 kubenswrapper[7728]: I0223 14:19:33.074729 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: 
\"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.074850 master-0 kubenswrapper[7728]: I0223 14:19:33.074807 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.074850 master-0 kubenswrapper[7728]: I0223 14:19:33.074826 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0315476e-7140-4777-8061-9cead4c92024-tmpfs\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.074850 master-0 kubenswrapper[7728]: I0223 14:19:33.074852 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtp7w\" (UniqueName: \"kubernetes.io/projected/0315476e-7140-4777-8061-9cead4c92024-kube-api-access-jtp7w\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.075376 master-0 kubenswrapper[7728]: I0223 14:19:33.075129 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-var-lock\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.075376 master-0 kubenswrapper[7728]: I0223 14:19:33.075158 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/0fdb9885-7479-43b5-8613-b2857a798ade-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.075376 master-0 kubenswrapper[7728]: I0223 14:19:33.075184 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.075456 master-0 kubenswrapper[7728]: I0223 14:19:33.075435 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0315476e-7140-4777-8061-9cead4c92024-tmpfs\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.079191 master-0 kubenswrapper[7728]: I0223 14:19:33.078988 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.081243 master-0 kubenswrapper[7728]: I0223 14:19:33.081212 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.088964 master-0 kubenswrapper[7728]: I0223 14:19:33.086993 7728 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:19:33.091772 master-0 kubenswrapper[7728]: I0223 14:19:33.091747 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtp7w\" (UniqueName: \"kubernetes.io/projected/0315476e-7140-4777-8061-9cead4c92024-kube-api-access-jtp7w\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.186569 master-0 kubenswrapper[7728]: I0223 14:19:33.177295 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.186569 master-0 kubenswrapper[7728]: I0223 14:19:33.177592 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-var-lock\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.186569 master-0 kubenswrapper[7728]: I0223 14:19:33.177645 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fdb9885-7479-43b5-8613-b2857a798ade-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.186569 master-0 kubenswrapper[7728]: I0223 14:19:33.178872 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-kubelet-dir\") pod 
\"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.186569 master-0 kubenswrapper[7728]: I0223 14:19:33.179520 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-var-lock\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.197584 master-0 kubenswrapper[7728]: I0223 14:19:33.197267 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fdb9885-7479-43b5-8613-b2857a798ade-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.231503 master-0 kubenswrapper[7728]: I0223 14:19:33.229735 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a5f3286-9ec6-4867-aaf7-3c31b1f6c126" path="/var/lib/kubelet/pods/5a5f3286-9ec6-4867-aaf7-3c31b1f6c126/volumes" Feb 23 14:19:33.231503 master-0 kubenswrapper[7728]: I0223 14:19:33.230624 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9694f604-5dcf-4aec-a54f-67b4d8dc8809" path="/var/lib/kubelet/pods/9694f604-5dcf-4aec-a54f-67b4d8dc8809/volumes" Feb 23 14:19:33.231503 master-0 kubenswrapper[7728]: I0223 14:19:33.231311 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2af015a-7f5f-4702-9f06-8159a040dabe" path="/var/lib/kubelet/pods/b2af015a-7f5f-4702-9f06-8159a040dabe/volumes" Feb 23 14:19:33.243710 master-0 kubenswrapper[7728]: I0223 14:19:33.240125 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:33.327454 master-0 kubenswrapper[7728]: I0223 14:19:33.327350 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:19:33.394173 master-0 kubenswrapper[7728]: I0223 14:19:33.394126 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"] Feb 23 14:19:33.395794 master-0 kubenswrapper[7728]: I0223 14:19:33.395758 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.398209 master-0 kubenswrapper[7728]: I0223 14:19:33.397876 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 14:19:33.398209 master-0 kubenswrapper[7728]: I0223 14:19:33.398056 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-hnfcp" Feb 23 14:19:33.398379 master-0 kubenswrapper[7728]: I0223 14:19:33.398223 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 14:19:33.398543 master-0 kubenswrapper[7728]: I0223 14:19:33.398510 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 14:19:33.398634 master-0 kubenswrapper[7728]: I0223 14:19:33.398615 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 14:19:33.398893 master-0 kubenswrapper[7728]: I0223 14:19:33.398826 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 14:19:33.410549 master-0 
kubenswrapper[7728]: I0223 14:19:33.403614 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"] Feb 23 14:19:33.481323 master-0 kubenswrapper[7728]: I0223 14:19:33.481166 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.481323 master-0 kubenswrapper[7728]: I0223 14:19:33.481212 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khfkr\" (UniqueName: \"kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.481544 master-0 kubenswrapper[7728]: I0223 14:19:33.481373 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.481544 master-0 kubenswrapper[7728]: I0223 14:19:33.481462 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " 
pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.583155 master-0 kubenswrapper[7728]: I0223 14:19:33.582735 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.583155 master-0 kubenswrapper[7728]: I0223 14:19:33.582778 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khfkr\" (UniqueName: \"kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.583155 master-0 kubenswrapper[7728]: I0223 14:19:33.582817 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.583155 master-0 kubenswrapper[7728]: I0223 14:19:33.582843 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.584705 master-0 kubenswrapper[7728]: I0223 14:19:33.584655 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.584906 master-0 kubenswrapper[7728]: I0223 14:19:33.584870 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.586975 master-0 kubenswrapper[7728]: I0223 14:19:33.586816 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.599053 master-0 kubenswrapper[7728]: I0223 14:19:33.599009 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khfkr\" (UniqueName: \"kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:33.712537 master-0 kubenswrapper[7728]: I0223 14:19:33.712062 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:35.346125 master-0 kubenswrapper[7728]: W0223 14:19:35.346041 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceba7b56_f910_473d_aed5_add94868fb31.slice/crio-f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e WatchSource:0}: Error finding container f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e: Status 404 returned error can't find the container with id f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e Feb 23 14:19:35.425158 master-0 kubenswrapper[7728]: I0223 14:19:35.425082 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n82gm"] Feb 23 14:19:35.426452 master-0 kubenswrapper[7728]: I0223 14:19:35.426389 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.523991 master-0 kubenswrapper[7728]: I0223 14:19:35.523921 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-utilities\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.524232 master-0 kubenswrapper[7728]: I0223 14:19:35.524017 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcqk\" (UniqueName: \"kubernetes.io/projected/43ce2f82-05aa-4778-a444-848a408cf570-kube-api-access-ggcqk\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.524232 master-0 kubenswrapper[7728]: I0223 14:19:35.524147 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-catalog-content\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.547312 master-0 kubenswrapper[7728]: I0223 14:19:35.547269 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n82gm"] Feb 23 14:19:35.565532 master-0 kubenswrapper[7728]: I0223 14:19:35.565294 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mtvwp"] Feb 23 14:19:35.567634 master-0 kubenswrapper[7728]: I0223 14:19:35.567605 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.576984 master-0 kubenswrapper[7728]: I0223 14:19:35.575988 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mtvwp"] Feb 23 14:19:35.625182 master-0 kubenswrapper[7728]: I0223 14:19:35.625041 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-utilities\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.625182 master-0 kubenswrapper[7728]: I0223 14:19:35.625094 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcqk\" (UniqueName: \"kubernetes.io/projected/43ce2f82-05aa-4778-a444-848a408cf570-kube-api-access-ggcqk\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.625182 master-0 kubenswrapper[7728]: I0223 14:19:35.625136 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-catalog-content\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.625673 master-0 kubenswrapper[7728]: I0223 14:19:35.625649 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-catalog-content\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.625755 master-0 kubenswrapper[7728]: I0223 14:19:35.625701 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-utilities\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.646035 master-0 kubenswrapper[7728]: I0223 14:19:35.645976 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcqk\" (UniqueName: \"kubernetes.io/projected/43ce2f82-05aa-4778-a444-848a408cf570-kube-api-access-ggcqk\") pod \"redhat-marketplace-n82gm\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.652752 master-0 kubenswrapper[7728]: I0223 14:19:35.652684 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fhcgg"] Feb 23 14:19:35.664958 master-0 kubenswrapper[7728]: I0223 14:19:35.664925 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.667205 master-0 kubenswrapper[7728]: I0223 14:19:35.667154 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 14:19:35.726799 master-0 kubenswrapper[7728]: I0223 14:19:35.726736 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jlzj\" (UniqueName: \"kubernetes.io/projected/76c67569-3a72-4de9-87cd-432a4607b15b-kube-api-access-2jlzj\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.726799 master-0 kubenswrapper[7728]: I0223 14:19:35.726782 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-catalog-content\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.726799 master-0 kubenswrapper[7728]: I0223 14:19:35.726808 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/76c67569-3a72-4de9-87cd-432a4607b15b-rootfs\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.727055 master-0 kubenswrapper[7728]: I0223 14:19:35.726840 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " 
pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.727055 master-0 kubenswrapper[7728]: I0223 14:19:35.726882 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqvf6\" (UniqueName: \"kubernetes.io/projected/3a398c0f-1b6a-4836-a8b4-33b004350d84-kube-api-access-rqvf6\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.727055 master-0 kubenswrapper[7728]: I0223 14:19:35.726918 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.727055 master-0 kubenswrapper[7728]: I0223 14:19:35.726948 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-utilities\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.762838 master-0 kubenswrapper[7728]: I0223 14:19:35.762759 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:19:35.827756 master-0 kubenswrapper[7728]: I0223 14:19:35.827712 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/76c67569-3a72-4de9-87cd-432a4607b15b-rootfs\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.827966 master-0 kubenswrapper[7728]: I0223 14:19:35.827792 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.827966 master-0 kubenswrapper[7728]: I0223 14:19:35.827825 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvf6\" (UniqueName: \"kubernetes.io/projected/3a398c0f-1b6a-4836-a8b4-33b004350d84-kube-api-access-rqvf6\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.827966 master-0 kubenswrapper[7728]: I0223 14:19:35.827870 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.827966 master-0 kubenswrapper[7728]: I0223 14:19:35.827918 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-utilities\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.827966 master-0 kubenswrapper[7728]: I0223 14:19:35.827952 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jlzj\" (UniqueName: \"kubernetes.io/projected/76c67569-3a72-4de9-87cd-432a4607b15b-kube-api-access-2jlzj\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.828116 master-0 kubenswrapper[7728]: I0223 14:19:35.828009 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-catalog-content\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.828639 master-0 kubenswrapper[7728]: I0223 14:19:35.828587 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-catalog-content\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.828872 master-0 kubenswrapper[7728]: I0223 14:19:35.828825 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/76c67569-3a72-4de9-87cd-432a4607b15b-rootfs\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.829243 master-0 kubenswrapper[7728]: I0223 14:19:35.829206 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-utilities\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.829730 master-0 kubenswrapper[7728]: I0223 14:19:35.829648 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.831709 master-0 kubenswrapper[7728]: I0223 14:19:35.831678 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.847740 master-0 kubenswrapper[7728]: I0223 14:19:35.847699 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jlzj\" (UniqueName: \"kubernetes.io/projected/76c67569-3a72-4de9-87cd-432a4607b15b-kube-api-access-2jlzj\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:35.857881 master-0 kubenswrapper[7728]: I0223 14:19:35.857843 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqvf6\" (UniqueName: \"kubernetes.io/projected/3a398c0f-1b6a-4836-a8b4-33b004350d84-kube-api-access-rqvf6\") pod \"redhat-operators-mtvwp\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") " pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.896775 master-0 kubenswrapper[7728]: I0223 14:19:35.896693 7728 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:19:35.991886 master-0 kubenswrapper[7728]: I0223 14:19:35.991823 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:19:36.055521 master-0 kubenswrapper[7728]: I0223 14:19:36.055463 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" event={"ID":"ceba7b56-f910-473d-aed5-add94868fb31","Type":"ContainerStarted","Data":"f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e"} Feb 23 14:19:41.080725 master-0 kubenswrapper[7728]: I0223 14:19:41.080643 7728 generic.go:334] "Generic (PLEG): container finished" podID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerID="6fdaded4c1d5d4706ada0063d02a22ac0f3bed1016ec71609468c9f080c894da" exitCode=0 Feb 23 14:19:41.080725 master-0 kubenswrapper[7728]: I0223 14:19:41.080682 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerDied","Data":"6fdaded4c1d5d4706ada0063d02a22ac0f3bed1016ec71609468c9f080c894da"} Feb 23 14:19:41.081651 master-0 kubenswrapper[7728]: I0223 14:19:41.081195 7728 scope.go:117] "RemoveContainer" containerID="6fdaded4c1d5d4706ada0063d02a22ac0f3bed1016ec71609468c9f080c894da" Feb 23 14:19:42.692482 master-0 kubenswrapper[7728]: W0223 14:19:42.692377 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76c67569_3a72_4de9_87cd_432a4607b15b.slice/crio-475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474 WatchSource:0}: Error finding container 475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474: Status 404 returned error can't find the container with id 
475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474 Feb 23 14:19:43.149774 master-0 kubenswrapper[7728]: I0223 14:19:43.147932 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474"} Feb 23 14:19:43.152405 master-0 kubenswrapper[7728]: I0223 14:19:43.152359 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"4c56fad74102d69cc7f4ad84a92d8641223fd6345d161d89086b8aeb7a8a3450"} Feb 23 14:19:43.157596 master-0 kubenswrapper[7728]: I0223 14:19:43.156873 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" event={"ID":"ceba7b56-f910-473d-aed5-add94868fb31","Type":"ContainerStarted","Data":"a92c459609a9caf278c919ccf0e276499a4748b87d0a276b18119bbb8961e6d8"} Feb 23 14:19:43.159461 master-0 kubenswrapper[7728]: I0223 14:19:43.159419 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerStarted","Data":"54011ffa8f000620849835983f9e2c00740786e321e8cd4e4de797c7d208b465"} Feb 23 14:19:43.428066 master-0 kubenswrapper[7728]: I0223 14:19:43.422613 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Feb 23 14:19:43.591384 master-0 kubenswrapper[7728]: I0223 14:19:43.588671 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"] Feb 23 14:19:43.600024 master-0 kubenswrapper[7728]: I0223 14:19:43.599118 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-mtvwp"] Feb 23 14:19:43.778440 master-0 kubenswrapper[7728]: I0223 14:19:43.778388 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh"] Feb 23 14:19:43.778913 master-0 kubenswrapper[7728]: I0223 14:19:43.778900 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Feb 23 14:19:43.782098 master-0 kubenswrapper[7728]: I0223 14:19:43.782063 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwtc6"] Feb 23 14:19:43.784074 master-0 kubenswrapper[7728]: I0223 14:19:43.783961 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n82gm"] Feb 23 14:19:43.839819 master-0 kubenswrapper[7728]: I0223 14:19:43.839541 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"] Feb 23 14:19:43.839819 master-0 kubenswrapper[7728]: I0223 14:19:43.839607 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xxh6f"] Feb 23 14:19:44.169323 master-0 kubenswrapper[7728]: I0223 14:19:44.169270 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"493a9ed3-6d64-489a-a68c-235b69a58782","Type":"ContainerStarted","Data":"cb9c2c793a1a03d8088100a56c493a223ec7cd474c24708ed4bb05825975b542"} Feb 23 14:19:44.174549 master-0 kubenswrapper[7728]: I0223 14:19:44.174470 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerStarted","Data":"348647c8be47f1f0398a726d98ab4e65fbf23ef3ceae1691e078bd87dddb99c7"} Feb 23 14:19:44.178373 master-0 kubenswrapper[7728]: I0223 
14:19:44.178336 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerStarted","Data":"30dd1f19a8b444dbc9b769a06f0917819d1c1e9174b5fb3b5552595a9eed345f"} Feb 23 14:19:44.178428 master-0 kubenswrapper[7728]: I0223 14:19:44.178384 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerStarted","Data":"842c59a633c6726baab1699104248bceff992214333b768aa99b1550ee1de3d0"} Feb 23 14:19:44.178809 master-0 kubenswrapper[7728]: I0223 14:19:44.178594 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:44.183463 master-0 kubenswrapper[7728]: I0223 14:19:44.183300 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body= Feb 23 14:19:44.183623 master-0 kubenswrapper[7728]: I0223 14:19:44.183561 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59b498fcfb-rz897" event={"ID":"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2","Type":"ContainerStarted","Data":"8d9482f71c6dc9a00e4b04f7ca64bfc8a0661a7250561f2c67d72c80ed865f03"} Feb 23 14:19:44.183664 master-0 kubenswrapper[7728]: I0223 14:19:44.183613 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial 
tcp 10.128.0.64:8443: connect: connection refused" Feb 23 14:19:44.188839 master-0 kubenswrapper[7728]: I0223 14:19:44.185777 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"ea6ff3745fdddde6e725677d9a1d30748c6e897ee50a3c7ab0203d9af3e9590f"} Feb 23 14:19:44.188839 master-0 kubenswrapper[7728]: I0223 14:19:44.185814 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9"} Feb 23 14:19:44.190053 master-0 kubenswrapper[7728]: I0223 14:19:44.190009 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerStarted","Data":"943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904"} Feb 23 14:19:44.190127 master-0 kubenswrapper[7728]: I0223 14:19:44.190056 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerStarted","Data":"121fb1d62a402b22b2ce0dcefcc58af76b44ad548fdacc6da5113c93b5d1d4e0"} Feb 23 14:19:44.191647 master-0 kubenswrapper[7728]: I0223 14:19:44.191338 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:44.218714 master-0 kubenswrapper[7728]: I0223 14:19:44.203763 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" podStartSLOduration=3.029223794 podStartE2EDuration="19.203746991s" 
podCreationTimestamp="2026-02-23 14:19:25 +0000 UTC" firstStartedPulling="2026-02-23 14:19:26.622604159 +0000 UTC m=+59.585265455" lastFinishedPulling="2026-02-23 14:19:42.797127356 +0000 UTC m=+75.759788652" observedRunningTime="2026-02-23 14:19:44.202906343 +0000 UTC m=+77.165567659" watchObservedRunningTime="2026-02-23 14:19:44.203746991 +0000 UTC m=+77.166408287" Feb 23 14:19:44.218714 master-0 kubenswrapper[7728]: I0223 14:19:44.205492 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0fdb9885-7479-43b5-8613-b2857a798ade","Type":"ContainerStarted","Data":"0a7994e86e7ddf474fa9a6e9d028e17c8d71e5299119418e1b05d25a7b604984"} Feb 23 14:19:44.218714 master-0 kubenswrapper[7728]: I0223 14:19:44.205539 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0fdb9885-7479-43b5-8613-b2857a798ade","Type":"ContainerStarted","Data":"3cbdfb9045c2d2cb397063c37573cc9d345a2e61b6805238ad5391bd43edfbaa"} Feb 23 14:19:44.218714 master-0 kubenswrapper[7728]: I0223 14:19:44.212379 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:19:44.232442 master-0 kubenswrapper[7728]: I0223 14:19:44.232383 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerStarted","Data":"08d3df84ad8de18eec9e6a636baf4cb95ff798ffedb9a2d917a6b77d6c934fb7"} Feb 23 14:19:44.246509 master-0 kubenswrapper[7728]: I0223 14:19:44.243967 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" event={"ID":"ad0f0d72-0337-4347-bb50-e299a175f3ca","Type":"ContainerStarted","Data":"c3f209a9ce16ae00e125bd88a555117337a8948041a4b5c781124f66c958f969"} Feb 23 14:19:44.253791 
master-0 kubenswrapper[7728]: I0223 14:19:44.253711 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxh6f" event={"ID":"2a0bf4c4-8272-4f24-8e48-525d7a278b26","Type":"ContainerStarted","Data":"023a865fc1618efadf669d9c23556b255086ea013fc0ed3bfdf62e3156c24070"} Feb 23 14:19:44.280523 master-0 kubenswrapper[7728]: I0223 14:19:44.275213 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" podStartSLOduration=3.740572838 podStartE2EDuration="16.275194041s" podCreationTimestamp="2026-02-23 14:19:28 +0000 UTC" firstStartedPulling="2026-02-23 14:19:30.099431909 +0000 UTC m=+63.062093215" lastFinishedPulling="2026-02-23 14:19:42.634053122 +0000 UTC m=+75.596714418" observedRunningTime="2026-02-23 14:19:44.268374764 +0000 UTC m=+77.231036060" watchObservedRunningTime="2026-02-23 14:19:44.275194041 +0000 UTC m=+77.237855337" Feb 23 14:19:44.280523 master-0 kubenswrapper[7728]: I0223 14:19:44.279761 7728 generic.go:334] "Generic (PLEG): container finished" podID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerID="b36b4de11e46511d30e9c21f6168fa9e64f6cbbd9e9b705ec3f96b48214779bc" exitCode=0 Feb 23 14:19:44.280523 master-0 kubenswrapper[7728]: I0223 14:19:44.279864 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtvwp" event={"ID":"3a398c0f-1b6a-4836-a8b4-33b004350d84","Type":"ContainerDied","Data":"b36b4de11e46511d30e9c21f6168fa9e64f6cbbd9e9b705ec3f96b48214779bc"} Feb 23 14:19:44.280523 master-0 kubenswrapper[7728]: I0223 14:19:44.279890 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtvwp" event={"ID":"3a398c0f-1b6a-4836-a8b4-33b004350d84","Type":"ContainerStarted","Data":"ccfc6bc0e26aee4698f58d8e9759b6618ec00fc42f551f9f7741ee4b369d51bb"} Feb 23 14:19:44.284833 master-0 kubenswrapper[7728]: I0223 14:19:44.284590 7728 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" event={"ID":"0315476e-7140-4777-8061-9cead4c92024","Type":"ContainerStarted","Data":"65ed56b86651ccae834781c13bcafb7784fdc880e0648daccca8f9316f065493"} Feb 23 14:19:44.284833 master-0 kubenswrapper[7728]: I0223 14:19:44.284637 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" event={"ID":"0315476e-7140-4777-8061-9cead4c92024","Type":"ContainerStarted","Data":"97f199fa26d5c3158a89f49cff2f70c3039a2be9fc4ad7fe0571f7a519be854c"} Feb 23 14:19:44.285161 master-0 kubenswrapper[7728]: I0223 14:19:44.285139 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:44.310563 master-0 kubenswrapper[7728]: I0223 14:19:44.310497 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" event={"ID":"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4","Type":"ContainerStarted","Data":"7d1da5487ea8eb934b2cc7ce134e56590c6e42e0ab8b62329b544c3d98919034"} Feb 23 14:19:44.310563 master-0 kubenswrapper[7728]: I0223 14:19:44.310544 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" event={"ID":"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4","Type":"ContainerStarted","Data":"69bf6c3db3f46a2800c793cb644d9476720461fc601ab495118365c94dd14b4f"} Feb 23 14:19:44.330539 master-0 kubenswrapper[7728]: I0223 14:19:44.327343 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"d07dabdbb75e5831d675bd90d3cedb35103b104effb049877f3263f5f9bc95d3"} Feb 23 14:19:44.330539 master-0 kubenswrapper[7728]: I0223 14:19:44.327398 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"a927374fcf62fad56c9d8325450d07e92c07f04787ed291d9c0071fab4d22549"} Feb 23 14:19:44.333535 master-0 kubenswrapper[7728]: I0223 14:19:44.332916 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-59b498fcfb-rz897" podStartSLOduration=3.093736992 podStartE2EDuration="14.332899344s" podCreationTimestamp="2026-02-23 14:19:30 +0000 UTC" firstStartedPulling="2026-02-23 14:19:31.45336701 +0000 UTC m=+64.416028306" lastFinishedPulling="2026-02-23 14:19:42.692529362 +0000 UTC m=+75.655190658" observedRunningTime="2026-02-23 14:19:44.332524146 +0000 UTC m=+77.295185442" watchObservedRunningTime="2026-02-23 14:19:44.332899344 +0000 UTC m=+77.295560640" Feb 23 14:19:44.334896 master-0 kubenswrapper[7728]: I0223 14:19:44.334154 7728 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="4c56fad74102d69cc7f4ad84a92d8641223fd6345d161d89086b8aeb7a8a3450" exitCode=0 Feb 23 14:19:44.334896 master-0 kubenswrapper[7728]: I0223 14:19:44.334239 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"4c56fad74102d69cc7f4ad84a92d8641223fd6345d161d89086b8aeb7a8a3450"} Feb 23 14:19:44.335732 master-0 kubenswrapper[7728]: I0223 14:19:44.335286 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n82gm" event={"ID":"43ce2f82-05aa-4778-a444-848a408cf570","Type":"ContainerStarted","Data":"f5924ea863c0ed9e2a9af26deac1c7c0b867ae7a40e98d453b8a9fd857b9c3cf"} Feb 23 14:19:44.335732 master-0 kubenswrapper[7728]: I0223 14:19:44.335304 7728 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-n82gm" event={"ID":"43ce2f82-05aa-4778-a444-848a408cf570","Type":"ContainerStarted","Data":"d034ee90da289ad6b98a9bab827fdbf238a1e22975fb6217c4d11b94c5b9f815"} Feb 23 14:19:44.338710 master-0 kubenswrapper[7728]: I0223 14:19:44.338112 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerStarted","Data":"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"} Feb 23 14:19:44.338710 master-0 kubenswrapper[7728]: I0223 14:19:44.338137 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerStarted","Data":"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"} Feb 23 14:19:44.351527 master-0 kubenswrapper[7728]: I0223 14:19:44.351198 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerStarted","Data":"265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014"} Feb 23 14:19:44.351527 master-0 kubenswrapper[7728]: I0223 14:19:44.351232 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerStarted","Data":"72afd593f65efb98e2b6fe3298c11d36a194a008b8e1244f1c39b4b489e885a8"} Feb 23 14:19:44.367778 master-0 kubenswrapper[7728]: I0223 14:19:44.367739 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" 
event={"ID":"fbb66172-1ea9-4683-b88f-227c4fd94924","Type":"ContainerStarted","Data":"4bd96aadee1934ae65fac50d897e75007505a399d7d143ad871ced8edd81b895"} Feb 23 14:19:44.507985 master-0 kubenswrapper[7728]: I0223 14:19:44.507914 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" podStartSLOduration=13.507895064 podStartE2EDuration="13.507895064s" podCreationTimestamp="2026-02-23 14:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:44.429499895 +0000 UTC m=+77.392161191" watchObservedRunningTime="2026-02-23 14:19:44.507895064 +0000 UTC m=+77.470556360" Feb 23 14:19:44.508413 master-0 kubenswrapper[7728]: I0223 14:19:44.508376 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podStartSLOduration=13.508372285 podStartE2EDuration="13.508372285s" podCreationTimestamp="2026-02-23 14:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:44.50723511 +0000 UTC m=+77.469896406" watchObservedRunningTime="2026-02-23 14:19:44.508372285 +0000 UTC m=+77.471033581" Feb 23 14:19:44.579536 master-0 kubenswrapper[7728]: I0223 14:19:44.575967 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"] Feb 23 14:19:44.579536 master-0 kubenswrapper[7728]: I0223 14:19:44.576228 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="kube-rbac-proxy" containerID="cri-o://75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a" gracePeriod=30 Feb 23 
14:19:44.579536 master-0 kubenswrapper[7728]: I0223 14:19:44.576358 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="machine-approver-controller" containerID="cri-o://216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5" gracePeriod=30 Feb 23 14:19:44.579536 master-0 kubenswrapper[7728]: I0223 14:19:44.578645 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=12.578620458 podStartE2EDuration="12.578620458s" podCreationTimestamp="2026-02-23 14:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:44.563616795 +0000 UTC m=+77.526278091" watchObservedRunningTime="2026-02-23 14:19:44.578620458 +0000 UTC m=+77.541281754" Feb 23 14:19:44.667934 master-0 kubenswrapper[7728]: I0223 14:19:44.667800 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" podStartSLOduration=4.321713298 podStartE2EDuration="16.667778309s" podCreationTimestamp="2026-02-23 14:19:28 +0000 UTC" firstStartedPulling="2026-02-23 14:19:30.360448512 +0000 UTC m=+63.323109818" lastFinishedPulling="2026-02-23 14:19:42.706513523 +0000 UTC m=+75.669174829" observedRunningTime="2026-02-23 14:19:44.655185538 +0000 UTC m=+77.617846834" watchObservedRunningTime="2026-02-23 14:19:44.667778309 +0000 UTC m=+77.630439605" Feb 23 14:19:44.744440 master-0 kubenswrapper[7728]: I0223 14:19:44.744361 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" podStartSLOduration=12.744339499 podStartE2EDuration="12.744339499s" podCreationTimestamp="2026-02-23 14:19:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:44.738274628 +0000 UTC m=+77.700935924" watchObservedRunningTime="2026-02-23 14:19:44.744339499 +0000 UTC m=+77.707000795" Feb 23 14:19:44.771801 master-0 kubenswrapper[7728]: I0223 14:19:44.771578 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" podStartSLOduration=4.769427304 podStartE2EDuration="18.771561715s" podCreationTimestamp="2026-02-23 14:19:26 +0000 UTC" firstStartedPulling="2026-02-23 14:19:28.591209964 +0000 UTC m=+61.553871260" lastFinishedPulling="2026-02-23 14:19:42.593344375 +0000 UTC m=+75.556005671" observedRunningTime="2026-02-23 14:19:44.770097754 +0000 UTC m=+77.732759050" watchObservedRunningTime="2026-02-23 14:19:44.771561715 +0000 UTC m=+77.734223011" Feb 23 14:19:44.794446 master-0 kubenswrapper[7728]: I0223 14:19:44.794398 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" Feb 23 14:19:44.795192 master-0 kubenswrapper[7728]: I0223 14:19:44.795132 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" podStartSLOduration=4.819071854 podStartE2EDuration="18.795121283s" podCreationTimestamp="2026-02-23 14:19:26 +0000 UTC" firstStartedPulling="2026-02-23 14:19:28.71077507 +0000 UTC m=+61.673436366" lastFinishedPulling="2026-02-23 14:19:42.686824499 +0000 UTC m=+75.649485795" observedRunningTime="2026-02-23 14:19:44.794278715 +0000 UTC m=+77.756940011" watchObservedRunningTime="2026-02-23 14:19:44.795121283 +0000 UTC m=+77.757782579" Feb 23 14:19:44.832331 master-0 kubenswrapper[7728]: I0223 14:19:44.832281 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-machine-approver-tls\") pod \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " Feb 23 14:19:44.832331 master-0 kubenswrapper[7728]: I0223 14:19:44.832327 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdcpf\" (UniqueName: \"kubernetes.io/projected/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-kube-api-access-zdcpf\") pod \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " Feb 23 14:19:44.832563 master-0 kubenswrapper[7728]: I0223 14:19:44.832369 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-config\") pod \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " Feb 23 14:19:44.832563 master-0 kubenswrapper[7728]: I0223 14:19:44.832423 7728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-auth-proxy-config\") pod \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\" (UID: \"d45c7d67-e103-4f68-b10f-9a1f3c56af6e\") " Feb 23 14:19:44.834241 master-0 kubenswrapper[7728]: I0223 14:19:44.833448 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-config" (OuterVolumeSpecName: "config") pod "d45c7d67-e103-4f68-b10f-9a1f3c56af6e" (UID: "d45c7d67-e103-4f68-b10f-9a1f3c56af6e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:44.835085 master-0 kubenswrapper[7728]: I0223 14:19:44.834836 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "d45c7d67-e103-4f68-b10f-9a1f3c56af6e" (UID: "d45c7d67-e103-4f68-b10f-9a1f3c56af6e"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:19:44.840639 master-0 kubenswrapper[7728]: I0223 14:19:44.840580 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-kube-api-access-zdcpf" (OuterVolumeSpecName: "kube-api-access-zdcpf") pod "d45c7d67-e103-4f68-b10f-9a1f3c56af6e" (UID: "d45c7d67-e103-4f68-b10f-9a1f3c56af6e"). InnerVolumeSpecName "kube-api-access-zdcpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:44.845416 master-0 kubenswrapper[7728]: I0223 14:19:44.840317 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "d45c7d67-e103-4f68-b10f-9a1f3c56af6e" (UID: "d45c7d67-e103-4f68-b10f-9a1f3c56af6e"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:19:44.853590 master-0 kubenswrapper[7728]: I0223 14:19:44.853293 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" podStartSLOduration=3.498334259 podStartE2EDuration="14.853245275s" podCreationTimestamp="2026-02-23 14:19:30 +0000 UTC" firstStartedPulling="2026-02-23 14:19:31.312051735 +0000 UTC m=+64.274713031" lastFinishedPulling="2026-02-23 14:19:42.666962741 +0000 UTC m=+75.629624047" observedRunningTime="2026-02-23 14:19:44.849445183 +0000 UTC m=+77.812106489" watchObservedRunningTime="2026-02-23 14:19:44.853245275 +0000 UTC m=+77.815906571" Feb 23 14:19:44.905721 master-0 kubenswrapper[7728]: I0223 14:19:44.905641 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podStartSLOduration=9.905620504 podStartE2EDuration="9.905620504s" podCreationTimestamp="2026-02-23 14:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:44.903724433 +0000 UTC m=+77.866385739" watchObservedRunningTime="2026-02-23 14:19:44.905620504 +0000 UTC m=+77.868281810" Feb 23 14:19:44.935509 master-0 kubenswrapper[7728]: I0223 14:19:44.935104 7728 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:44.935509 master-0 kubenswrapper[7728]: I0223 14:19:44.935149 7728 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:44.935509 master-0 kubenswrapper[7728]: I0223 14:19:44.935158 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdcpf\" (UniqueName: \"kubernetes.io/projected/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-kube-api-access-zdcpf\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:44.935509 master-0 kubenswrapper[7728]: I0223 14:19:44.935167 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45c7d67-e103-4f68-b10f-9a1f3c56af6e-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:45.021719 master-0 kubenswrapper[7728]: I0223 14:19:45.021662 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:19:45.387007 master-0 kubenswrapper[7728]: I0223 14:19:45.386890 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerDied","Data":"265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014"} Feb 23 14:19:45.387007 master-0 kubenswrapper[7728]: I0223 14:19:45.386830 7728 generic.go:334] "Generic (PLEG): container finished" podID="b993917a-bce8-4467-a09d-bfc923a90460" containerID="265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014" exitCode=0 Feb 23 14:19:45.389741 master-0 kubenswrapper[7728]: I0223 14:19:45.389706 7728 generic.go:334] "Generic (PLEG): container finished" podID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" 
containerID="216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5" exitCode=0 Feb 23 14:19:45.389741 master-0 kubenswrapper[7728]: I0223 14:19:45.389734 7728 generic.go:334] "Generic (PLEG): container finished" podID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerID="75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a" exitCode=0 Feb 23 14:19:45.389857 master-0 kubenswrapper[7728]: I0223 14:19:45.389779 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" Feb 23 14:19:45.389857 master-0 kubenswrapper[7728]: I0223 14:19:45.389792 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" event={"ID":"d45c7d67-e103-4f68-b10f-9a1f3c56af6e","Type":"ContainerDied","Data":"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5"} Feb 23 14:19:45.389857 master-0 kubenswrapper[7728]: I0223 14:19:45.389848 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" event={"ID":"d45c7d67-e103-4f68-b10f-9a1f3c56af6e","Type":"ContainerDied","Data":"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a"} Feb 23 14:19:45.389964 master-0 kubenswrapper[7728]: I0223 14:19:45.389858 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-798b897698-p8wds" event={"ID":"d45c7d67-e103-4f68-b10f-9a1f3c56af6e","Type":"ContainerDied","Data":"f2b1a440d4d0941474a18e0ad7a1503dff271821abbfe9ad25e9a1f920cb81fc"} Feb 23 14:19:45.389964 master-0 kubenswrapper[7728]: I0223 14:19:45.389879 7728 scope.go:117] "RemoveContainer" containerID="216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5" Feb 23 14:19:45.391366 master-0 kubenswrapper[7728]: I0223 14:19:45.391317 7728 generic.go:334] "Generic (PLEG): container finished" 
podID="43ce2f82-05aa-4778-a444-848a408cf570" containerID="f5924ea863c0ed9e2a9af26deac1c7c0b867ae7a40e98d453b8a9fd857b9c3cf" exitCode=0 Feb 23 14:19:45.391447 master-0 kubenswrapper[7728]: I0223 14:19:45.391376 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n82gm" event={"ID":"43ce2f82-05aa-4778-a444-848a408cf570","Type":"ContainerDied","Data":"f5924ea863c0ed9e2a9af26deac1c7c0b867ae7a40e98d453b8a9fd857b9c3cf"} Feb 23 14:19:45.393696 master-0 kubenswrapper[7728]: I0223 14:19:45.393651 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"493a9ed3-6d64-489a-a68c-235b69a58782","Type":"ContainerStarted","Data":"1df16973da8e7c98a51b37b7335c255585ebd5dc4bbbed0d842fe3c32df42186"} Feb 23 14:19:45.406352 master-0 kubenswrapper[7728]: I0223 14:19:45.406308 7728 generic.go:334] "Generic (PLEG): container finished" podID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerID="65283b394e3f27d5698322b04176e2aa8708a2e0dcc6d84e940b16ca132e47aa" exitCode=0 Feb 23 14:19:45.406577 master-0 kubenswrapper[7728]: I0223 14:19:45.406417 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxh6f" event={"ID":"2a0bf4c4-8272-4f24-8e48-525d7a278b26","Type":"ContainerDied","Data":"65283b394e3f27d5698322b04176e2aa8708a2e0dcc6d84e940b16ca132e47aa"} Feb 23 14:19:45.409583 master-0 kubenswrapper[7728]: I0223 14:19:45.409499 7728 scope.go:117] "RemoveContainer" containerID="75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a" Feb 23 14:19:45.411456 master-0 kubenswrapper[7728]: I0223 14:19:45.411369 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerStarted","Data":"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"} Feb 23 
14:19:45.419556 master-0 kubenswrapper[7728]: I0223 14:19:45.419509 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:19:45.431801 master-0 kubenswrapper[7728]: I0223 14:19:45.431734 7728 scope.go:117] "RemoveContainer" containerID="216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5" Feb 23 14:19:45.434866 master-0 kubenswrapper[7728]: I0223 14:19:45.433847 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=13.433824454 podStartE2EDuration="13.433824454s" podCreationTimestamp="2026-02-23 14:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:19:45.417108584 +0000 UTC m=+78.379769880" watchObservedRunningTime="2026-02-23 14:19:45.433824454 +0000 UTC m=+78.396485740" Feb 23 14:19:45.435208 master-0 kubenswrapper[7728]: E0223 14:19:45.435150 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5\": container with ID starting with 216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5 not found: ID does not exist" containerID="216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5" Feb 23 14:19:45.435348 master-0 kubenswrapper[7728]: I0223 14:19:45.435194 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5"} err="failed to get container status \"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5\": rpc error: code = NotFound desc = could not find container \"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5\": container with ID starting with 
216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5 not found: ID does not exist" Feb 23 14:19:45.435348 master-0 kubenswrapper[7728]: I0223 14:19:45.435222 7728 scope.go:117] "RemoveContainer" containerID="75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a" Feb 23 14:19:45.435818 master-0 kubenswrapper[7728]: E0223 14:19:45.435702 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a\": container with ID starting with 75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a not found: ID does not exist" containerID="75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a" Feb 23 14:19:45.435818 master-0 kubenswrapper[7728]: I0223 14:19:45.435800 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a"} err="failed to get container status \"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a\": rpc error: code = NotFound desc = could not find container \"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a\": container with ID starting with 75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a not found: ID does not exist" Feb 23 14:19:45.435818 master-0 kubenswrapper[7728]: I0223 14:19:45.435814 7728 scope.go:117] "RemoveContainer" containerID="216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5" Feb 23 14:19:45.436271 master-0 kubenswrapper[7728]: I0223 14:19:45.436221 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5"} err="failed to get container status \"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5\": rpc error: code = NotFound desc = could not find container 
\"216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5\": container with ID starting with 216cf725b5e9119d6e5677183ba7a2ec7e3c14a7187b5187103db1bf344eb2a5 not found: ID does not exist" Feb 23 14:19:45.436271 master-0 kubenswrapper[7728]: I0223 14:19:45.436250 7728 scope.go:117] "RemoveContainer" containerID="75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a" Feb 23 14:19:45.436675 master-0 kubenswrapper[7728]: I0223 14:19:45.436607 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a"} err="failed to get container status \"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a\": rpc error: code = NotFound desc = could not find container \"75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a\": container with ID starting with 75b11e6bbaf0eb60d298e18d01053ee976e173fca1e1c0d3ffb9d0631545055a not found: ID does not exist" Feb 23 14:19:45.443868 master-0 kubenswrapper[7728]: I0223 14:19:45.443801 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"] Feb 23 14:19:45.447210 master-0 kubenswrapper[7728]: I0223 14:19:45.447166 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-798b897698-p8wds"] Feb 23 14:19:45.456058 master-0 kubenswrapper[7728]: I0223 14:19:45.455571 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" podStartSLOduration=4.3725658450000005 podStartE2EDuration="15.45545498s" podCreationTimestamp="2026-02-23 14:19:30 +0000 UTC" firstStartedPulling="2026-02-23 14:19:31.587613283 +0000 UTC m=+64.550274579" lastFinishedPulling="2026-02-23 14:19:42.670502418 +0000 UTC m=+75.633163714" observedRunningTime="2026-02-23 14:19:45.451958755 
+0000 UTC m=+78.414620051" watchObservedRunningTime="2026-02-23 14:19:45.45545498 +0000 UTC m=+78.418116276" Feb 23 14:19:45.481322 master-0 kubenswrapper[7728]: I0223 14:19:45.481265 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj"] Feb 23 14:19:45.481613 master-0 kubenswrapper[7728]: E0223 14:19:45.481570 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="kube-rbac-proxy" Feb 23 14:19:45.481613 master-0 kubenswrapper[7728]: I0223 14:19:45.481586 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="kube-rbac-proxy" Feb 23 14:19:45.481613 master-0 kubenswrapper[7728]: E0223 14:19:45.481606 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="machine-approver-controller" Feb 23 14:19:45.481757 master-0 kubenswrapper[7728]: I0223 14:19:45.481612 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="machine-approver-controller" Feb 23 14:19:45.481757 master-0 kubenswrapper[7728]: I0223 14:19:45.481709 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="machine-approver-controller" Feb 23 14:19:45.481757 master-0 kubenswrapper[7728]: I0223 14:19:45.481723 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" containerName="kube-rbac-proxy" Feb 23 14:19:45.485937 master-0 kubenswrapper[7728]: I0223 14:19:45.482935 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.490273 master-0 kubenswrapper[7728]: I0223 14:19:45.490231 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-zqmp5" Feb 23 14:19:45.490514 master-0 kubenswrapper[7728]: I0223 14:19:45.490497 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 23 14:19:45.492047 master-0 kubenswrapper[7728]: I0223 14:19:45.492001 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 23 14:19:45.492732 master-0 kubenswrapper[7728]: I0223 14:19:45.492574 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 23 14:19:45.492732 master-0 kubenswrapper[7728]: I0223 14:19:45.492645 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 23 14:19:45.492732 master-0 kubenswrapper[7728]: I0223 14:19:45.492648 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 23 14:19:45.646213 master-0 kubenswrapper[7728]: I0223 14:19:45.646004 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.646423 master-0 kubenswrapper[7728]: I0223 14:19:45.646386 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hj8ff\" (UniqueName: \"kubernetes.io/projected/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-kube-api-access-hj8ff\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.646549 master-0 kubenswrapper[7728]: I0223 14:19:45.646497 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.646633 master-0 kubenswrapper[7728]: I0223 14:19:45.646550 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.747913 master-0 kubenswrapper[7728]: I0223 14:19:45.747849 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.747913 master-0 kubenswrapper[7728]: I0223 14:19:45.747905 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " 
pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.748215 master-0 kubenswrapper[7728]: I0223 14:19:45.748028 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8ff\" (UniqueName: \"kubernetes.io/projected/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-kube-api-access-hj8ff\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.748215 master-0 kubenswrapper[7728]: I0223 14:19:45.748078 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.748855 master-0 kubenswrapper[7728]: I0223 14:19:45.748825 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.749318 master-0 kubenswrapper[7728]: I0223 14:19:45.749300 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.752321 master-0 kubenswrapper[7728]: I0223 14:19:45.752301 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:45.921016 master-0 kubenswrapper[7728]: I0223 14:19:45.920874 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8ff\" (UniqueName: \"kubernetes.io/projected/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-kube-api-access-hj8ff\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:46.111387 master-0 kubenswrapper[7728]: I0223 14:19:46.111338 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:19:46.139032 master-0 kubenswrapper[7728]: W0223 14:19:46.138991 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67a2ed2_f520_46fc_84d3_6816dc19f4e0.slice/crio-b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783 WatchSource:0}: Error finding container b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783: Status 404 returned error can't find the container with id b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783 Feb 23 14:19:46.424307 master-0 kubenswrapper[7728]: I0223 14:19:46.424274 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" event={"ID":"c67a2ed2-f520-46fc-84d3-6816dc19f4e0","Type":"ContainerStarted","Data":"db5ef21ef8eab7ca7c090f44b0b9b626032073152ce3cc515970e88c0bd210b9"} Feb 23 14:19:46.424307 master-0 kubenswrapper[7728]: I0223 14:19:46.424309 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" event={"ID":"c67a2ed2-f520-46fc-84d3-6816dc19f4e0","Type":"ContainerStarted","Data":"b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783"} Feb 23 14:19:46.748782 master-0 kubenswrapper[7728]: I0223 14:19:46.748718 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xxh6f"] Feb 23 14:19:46.940545 master-0 kubenswrapper[7728]: I0223 14:19:46.940455 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n82gm"] Feb 23 14:19:46.993115 master-0 kubenswrapper[7728]: I0223 14:19:46.993059 7728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Feb 23 14:19:46.993360 master-0 kubenswrapper[7728]: I0223 14:19:46.993306 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcdctl" containerID="cri-o://5809ecf60a8e4db68dfab073298af03c567dcc4e91a5b6d7f6d78ca758010d15" gracePeriod=30 Feb 23 14:19:46.994418 master-0 kubenswrapper[7728]: I0223 14:19:46.993441 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcd" containerID="cri-o://86dd361ededa7f9d61d9c2bea900261b661a76c0468603804e9af20765f8d8cd" gracePeriod=30 Feb 23 14:19:46.997232 master-0 kubenswrapper[7728]: I0223 14:19:46.997183 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Feb 23 14:19:46.997418 master-0 kubenswrapper[7728]: E0223 14:19:46.997401 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcdctl" Feb 23 14:19:46.997418 master-0 kubenswrapper[7728]: I0223 14:19:46.997417 7728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcdctl" Feb 23 14:19:46.999303 master-0 kubenswrapper[7728]: E0223 14:19:46.997436 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcd" Feb 23 14:19:46.999303 master-0 kubenswrapper[7728]: I0223 14:19:46.997443 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcd" Feb 23 14:19:47.009759 master-0 kubenswrapper[7728]: I0223 14:19:47.009707 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcd" Feb 23 14:19:47.009759 master-0 kubenswrapper[7728]: I0223 14:19:47.009751 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="12dab5d350ebc129b0bfa4714d330b15" containerName="etcdctl" Feb 23 14:19:47.013311 master-0 kubenswrapper[7728]: I0223 14:19:47.013280 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.068829 master-0 kubenswrapper[7728]: I0223 14:19:47.068741 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-usr-local-bin\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.069034 master-0 kubenswrapper[7728]: I0223 14:19:47.068915 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-resource-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.069034 master-0 kubenswrapper[7728]: I0223 14:19:47.068942 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-static-pod-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.069034 master-0 kubenswrapper[7728]: I0223 14:19:47.068996 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-data-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.069276 master-0 kubenswrapper[7728]: I0223 14:19:47.069246 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-cert-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.069318 master-0 kubenswrapper[7728]: I0223 14:19:47.069298 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-log-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.170856 master-0 kubenswrapper[7728]: I0223 14:19:47.170797 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-cert-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.170856 master-0 kubenswrapper[7728]: I0223 14:19:47.170856 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-log-dir\") pod \"etcd-master-0\" (UID: 
\"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171058 master-0 kubenswrapper[7728]: I0223 14:19:47.170903 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-usr-local-bin\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171058 master-0 kubenswrapper[7728]: I0223 14:19:47.170941 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-resource-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171058 master-0 kubenswrapper[7728]: I0223 14:19:47.170964 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-static-pod-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171058 master-0 kubenswrapper[7728]: I0223 14:19:47.170984 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-data-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171234 master-0 kubenswrapper[7728]: I0223 14:19:47.171068 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-data-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171234 master-0 kubenswrapper[7728]: I0223 14:19:47.171114 7728 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-cert-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171234 master-0 kubenswrapper[7728]: I0223 14:19:47.171142 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-log-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171234 master-0 kubenswrapper[7728]: I0223 14:19:47.171170 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-usr-local-bin\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171441 master-0 kubenswrapper[7728]: I0223 14:19:47.171227 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-resource-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.171441 master-0 kubenswrapper[7728]: I0223 14:19:47.171289 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-static-pod-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:19:47.250834 master-0 kubenswrapper[7728]: I0223 14:19:47.250750 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45c7d67-e103-4f68-b10f-9a1f3c56af6e" path="/var/lib/kubelet/pods/d45c7d67-e103-4f68-b10f-9a1f3c56af6e/volumes" Feb 23 14:19:47.437853 master-0 kubenswrapper[7728]: I0223 14:19:47.437818 
7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d1bffce5-019a-4c97-85f2-929dc19a0bde/installer/0.log" Feb 23 14:19:47.438086 master-0 kubenswrapper[7728]: I0223 14:19:47.437867 7728 generic.go:334] "Generic (PLEG): container finished" podID="d1bffce5-019a-4c97-85f2-929dc19a0bde" containerID="e8935c6e444aa0e24024f5ff856a9b30868587f159c8c2351155b9dff7539917" exitCode=1 Feb 23 14:19:47.438086 master-0 kubenswrapper[7728]: I0223 14:19:47.437898 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d1bffce5-019a-4c97-85f2-929dc19a0bde","Type":"ContainerDied","Data":"e8935c6e444aa0e24024f5ff856a9b30868587f159c8c2351155b9dff7539917"} Feb 23 14:19:48.122123 master-0 kubenswrapper[7728]: I0223 14:19:48.121728 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d1bffce5-019a-4c97-85f2-929dc19a0bde/installer/0.log" Feb 23 14:19:48.122123 master-0 kubenswrapper[7728]: I0223 14:19:48.121818 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:48.293403 master-0 kubenswrapper[7728]: I0223 14:19:48.290982 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-var-lock\") pod \"d1bffce5-019a-4c97-85f2-929dc19a0bde\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " Feb 23 14:19:48.293403 master-0 kubenswrapper[7728]: I0223 14:19:48.291086 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1bffce5-019a-4c97-85f2-929dc19a0bde-kube-api-access\") pod \"d1bffce5-019a-4c97-85f2-929dc19a0bde\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " Feb 23 14:19:48.293403 master-0 kubenswrapper[7728]: I0223 14:19:48.291094 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-var-lock" (OuterVolumeSpecName: "var-lock") pod "d1bffce5-019a-4c97-85f2-929dc19a0bde" (UID: "d1bffce5-019a-4c97-85f2-929dc19a0bde"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:48.293403 master-0 kubenswrapper[7728]: I0223 14:19:48.291154 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-kubelet-dir\") pod \"d1bffce5-019a-4c97-85f2-929dc19a0bde\" (UID: \"d1bffce5-019a-4c97-85f2-929dc19a0bde\") " Feb 23 14:19:48.293403 master-0 kubenswrapper[7728]: I0223 14:19:48.291204 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d1bffce5-019a-4c97-85f2-929dc19a0bde" (UID: "d1bffce5-019a-4c97-85f2-929dc19a0bde"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:19:48.293403 master-0 kubenswrapper[7728]: I0223 14:19:48.291666 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:48.293403 master-0 kubenswrapper[7728]: I0223 14:19:48.291692 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d1bffce5-019a-4c97-85f2-929dc19a0bde-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:48.294508 master-0 kubenswrapper[7728]: I0223 14:19:48.293907 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1bffce5-019a-4c97-85f2-929dc19a0bde-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d1bffce5-019a-4c97-85f2-929dc19a0bde" (UID: "d1bffce5-019a-4c97-85f2-929dc19a0bde"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:19:48.392705 master-0 kubenswrapper[7728]: I0223 14:19:48.392660 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1bffce5-019a-4c97-85f2-929dc19a0bde-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:19:48.446366 master-0 kubenswrapper[7728]: I0223 14:19:48.445852 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" event={"ID":"c67a2ed2-f520-46fc-84d3-6816dc19f4e0","Type":"ContainerStarted","Data":"c9089f7ec9b403b820c3589f967f8a29cc7167ff44a814892c003b600afb7102"} Feb 23 14:19:48.447611 master-0 kubenswrapper[7728]: I0223 14:19:48.447445 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d1bffce5-019a-4c97-85f2-929dc19a0bde/installer/0.log" Feb 23 14:19:48.447611 master-0 kubenswrapper[7728]: I0223 14:19:48.447505 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d1bffce5-019a-4c97-85f2-929dc19a0bde","Type":"ContainerDied","Data":"d2f40a1ec635a92856cfa2a3f7bc1ac31eb82fe0e85c0e36c0e8828880ceda10"} Feb 23 14:19:48.447611 master-0 kubenswrapper[7728]: I0223 14:19:48.447534 7728 scope.go:117] "RemoveContainer" containerID="e8935c6e444aa0e24024f5ff856a9b30868587f159c8c2351155b9dff7539917" Feb 23 14:19:48.447611 master-0 kubenswrapper[7728]: I0223 14:19:48.447569 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Feb 23 14:19:49.455610 master-0 kubenswrapper[7728]: I0223 14:19:49.455560 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748"} Feb 23 14:19:49.456129 master-0 kubenswrapper[7728]: I0223 14:19:49.456115 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:50.272296 master-0 kubenswrapper[7728]: I0223 14:19:50.272243 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:19:51.473262 master-0 kubenswrapper[7728]: I0223 14:19:51.473204 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:51.473828 master-0 kubenswrapper[7728]: I0223 14:19:51.473274 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:51.485867 master-0 kubenswrapper[7728]: I0223 14:19:51.485818 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:51.485867 master-0 kubenswrapper[7728]: I0223 14:19:51.485849 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:51.485986 master-0 kubenswrapper[7728]: I0223 14:19:51.485870 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:51.485986 master-0 kubenswrapper[7728]: I0223 14:19:51.485908 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:54.485715 master-0 kubenswrapper[7728]: I0223 14:19:54.485626 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:54.485715 master-0 kubenswrapper[7728]: I0223 14:19:54.485666 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:54.485715 master-0 kubenswrapper[7728]: I0223 14:19:54.485685 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:54.486793 master-0 kubenswrapper[7728]: I0223 14:19:54.485726 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:57.486306 master-0 kubenswrapper[7728]: I0223 14:19:57.486206 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:57.486306 master-0 kubenswrapper[7728]: I0223 14:19:57.486207 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:57.486306 master-0 kubenswrapper[7728]: I0223 14:19:57.486272 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:57.486306 master-0 kubenswrapper[7728]: I0223 14:19:57.486323 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:19:57.487556 master-0 kubenswrapper[7728]: I0223 14:19:57.486329 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:57.487556 master-0 kubenswrapper[7728]: I0223 14:19:57.487010 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748"} pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Feb 23 14:19:57.487556 master-0 kubenswrapper[7728]: I0223 14:19:57.487049 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" containerID="cri-o://5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748" gracePeriod=30 Feb 23 14:19:57.487556 master-0 kubenswrapper[7728]: I0223 14:19:57.487146 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:19:57.487556 master-0 kubenswrapper[7728]: I0223 14:19:57.487213 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:19:58.694651 master-0 kubenswrapper[7728]: E0223 14:19:58.694576 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:19:59.011435 master-0 kubenswrapper[7728]: I0223 14:19:59.011353 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/0.log" Feb 23 14:19:59.012322 master-0 kubenswrapper[7728]: I0223 14:19:59.012285 7728 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748" exitCode=255 Feb 23 14:19:59.012379 master-0 kubenswrapper[7728]: I0223 14:19:59.012336 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748"} Feb 23 14:19:59.402522 master-0 kubenswrapper[7728]: E0223 14:19:59.402131 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:19:49Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:19:49Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:19:49Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:19:49Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:94d88fe2fa42931a725508dbf17296b6ed99b8e20c1169f5d1fb8a36f4927ddd\\\"],\\\"sizeBytes\\\":1637274270},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a8ac0ba2e5115c9d451d553741173ae8744d4544da15e28bf38f61630182fd\\\"],\\\"sizeBytes\\\":1237794314},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4775c6461221dafe3ddd67ff683ccb665bed6eb278fa047d9d744aab9af65dcf\\\"],\\\"sizeBytes\\\":992461126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\\\"],\\\"sizeBytes\\\":943734757},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72fafcd55ab739919dd8a114863fda27106af1c497f474e7ce0cb23b58dfa021\\\"],\\\"sizeBytes\\\":875998518},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7b9239f1f5e9590e3db71e61fde86db8f43e0085f61ae7769508d2ea058481c7\\\"],\\\"sizeBytes\\\":862501144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3fa84eaa1310d97fe55bb23a7c27ece85718d0643fa7fc0ff81014edb4b948b\\\"],\\\"sizeBytes\\\":772838975},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bd420e879c9f0271bca2d123a6d762591d9a4626b72f254d1f885842c32149e8\\\"],\\\"sizeBytes\\\":687849728},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c467c1eeba7434b2aebf07169ab8afe0203d638e871dbdf29a16f830e9aef9e\\\"],\\\"sizeBytes\\\":682963466},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5121a0944000b7bfa57ae2e4eb3f412e1b4b89fcc75eec1ef20241182c0527f2\\\"],\\\"sizeBytes\\\":677827184},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a31b448302fbb994548ed801ac488a44e8a7c4ae9149c3b4cc20d6af832f83\\\"],\\\"sizeBytes\\\":621542709},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e089c4e4fa9a22803b2673b776215e021a1f12a856dbcaba2fadee29bee10a3\\\"],\\\"sizeBytes\\\":589275174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1582ea693f35073e3316e2380a18227b78096ca7f4e1328f1dd8a2c423da26e9\\\"],\\\"sizeBytes\\\":582052489},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74\\\"],\\\"sizeBytes\\\":558105176},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\\\"],\\\"sizeBytes\\\":529218694},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bc0ca626e5e17f9f78ddbfde54ea13ddc7749904911817bba16e6b59f30499ec\\\"],\\\"sizeBytes\\\":528829499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396\\\"],\\\"sizeBytes\\\":518279996},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:40bb7cf7c637bf9efd8fb0157839d325a019d67cc7d7279665fcf90dbb7f3f33\\\"],\\\"sizeBytes\\\":517888569},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\\\"],\\\"sizeBytes\\\":514875199},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce471c00b59fd855a59f7efa9afdb3f0f9cbf1c4bcce3a82fe1a
4cb82e90f52e\\\"],\\\"sizeBytes\\\":513119434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a9dcbc6b966928b7597d4a822948ae6f07b62feecb91679c1d825d0d19426e19\\\"],\\\"sizeBytes\\\":512172666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5f4a546983224e416dfcc3a700afc15f9790182a5a2f8f7c94892d0e95abab3\\\"],\\\"sizeBytes\\\":511125422},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8de5c5b21ed8c7829ba988d580ffa470c9913877fe0ee5e11bf507400ffbc7\\\"],\\\"sizeBytes\\\":511059399},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:64ba461fd5594e3a30bfd755f1496707a88249bc68d07c65124c8617d664d2ac\\\"],\\\"sizeBytes\\\":508786786},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a82e441a9e9b93f0e010f1ce26e30c24b6ca93f7752084d4694ebdb3c5b53f83\\\"],\\\"sizeBytes\\\":508443359},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896\\\"],\\\"sizeBytes\\\":507867630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e\\\"],\\\"sizeBytes\\\":506374680},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8618d42fe4da4881abe39e98691d187e13713981b66d0dac0a11cb1287482b7\\\"],\\\"sizeBytes\\\":506291135},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce68078d909b63bb5b872d94c04829aa1b5812c416abbaf9024840d348ee68b1\\\"],\\\"sizeBytes\\\":505244089},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\\\"],\\\"sizeBytes\\\":505137106},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:897708222502e4d710dd737923f74d153c084ba6048bffceb16dfd30f79a6ecc\\\"],\\\"sizeBytes\\\":504513960},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b\\\"],\\\"sizeBytes\\\":494959854},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2feb43a969b7dbc113ee1a572cbf6f9f922655\\\"],\\\"sizeBytes\\\":486990304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4797a485fd4ab3414ba8d52bdf2afccefab6c657b1d259baad703fca5145124c\\\"],\\\"sizeBytes\\\":484349508},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a132d09565133b36ac7c797213d6a74ac810bb368ef59136320ab3d300f45bd\\\"],\\\"sizeBytes\\\":484074784},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:235b846666adaa2e4b4d6d0f7fd71d57bf3be253466e1d9fffafd103fa2696ac\\\"],\\\"sizeBytes\\\":470575802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce89154fa3fe1e87c660e644b58cf125fede575869fd5841600082c0d1f858a3\\\"],\\\"sizeBytes\\\":468159025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2ba8aec9f09d75121b95d2e6f1097415302c0ae7121fa7076fd38d7adb9a5afa\\\"],\\\"sizeBytes\\\":467133839},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cb2014728aa54e620f65424402b14c5247016734a9a982c393dc011acb1a1f52\\\"],\\\"sizeBytes\\\":464984427},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:39d04e6e7ced98e7e189aff1bf392a4d4526e011fc6adead5c6b27dbd08776a9\\\"],\\\"sizeBytes\\\":463600445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f42321072d0ab781f41e8f595ed6f5efabe791e472c7d0784e61b3c214194656\\\"],\\\"sizeBytes\\\":458025547},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:24097d3bc90ed1fc543f5d96736c6091eb57b9e578d7186f430147ee28269cbf\\\"],\\\"sizeBytes\\\":456470711},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e53cc6c4d6263c99978c787e90575dd4818eac732589145ca7331186ad4f16de\\\"],\\\
"sizeBytes\\\":448723134},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fc46bdc145c2a9e4a89a5fe574cd228b7355eb99754255bf9a0c8bf2cc1de1f2\\\"],\\\"sizeBytes\\\":447940744},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eef7d0364bb9259fdc66e57df6df3a59ce7bf957a77d0ca25d4fedb5f122015\\\"],\\\"sizeBytes\\\":443170136},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b1d840665bf310fa455ddaff9b262dd0649440ca9ecf34d49b340ce669885568\\\"],\\\"sizeBytes\\\":411485245},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16ea15164e7d71550d4c0e2c90d17f96edda4ab77123947b2e188ffb23951fa0\\\"],\\\"sizeBytes\\\":407241636},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5001a555eb05eef7f23d64667303c2b4db8343ee900c265f7613c40c1db229\\\"],\\\"sizeBytes\\\":396420881}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:00.018807 master-0 kubenswrapper[7728]: I0223 14:20:00.018690 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46" exitCode=1 Feb 23 14:20:00.018807 master-0 kubenswrapper[7728]: I0223 14:20:00.018738 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46"} Feb 23 14:20:00.018807 master-0 kubenswrapper[7728]: I0223 14:20:00.018781 7728 scope.go:117] "RemoveContainer" containerID="ad9ef13f95d7901e7f24b0914da444cc2df5f3bc77853f6da272e6cf3ddf8974" Feb 23 14:20:00.019358 master-0 kubenswrapper[7728]: I0223 14:20:00.019318 7728 scope.go:117] "RemoveContainer" 
containerID="b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46" Feb 23 14:20:00.035610 master-0 kubenswrapper[7728]: E0223 14:20:00.035562 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Feb 23 14:20:00.035884 master-0 kubenswrapper[7728]: I0223 14:20:00.035861 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Feb 23 14:20:00.486440 master-0 kubenswrapper[7728]: I0223 14:20:00.486372 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:00.487154 master-0 kubenswrapper[7728]: I0223 14:20:00.486457 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:02.028786 master-0 kubenswrapper[7728]: I0223 14:20:02.028679 7728 generic.go:334] "Generic (PLEG): container finished" podID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerID="c172e0b4868c308f20f7ae8b13ba955f59eebc66ffba5fd517b3648866cbe26f" exitCode=0 Feb 23 14:20:02.029337 master-0 kubenswrapper[7728]: I0223 14:20:02.029243 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"5f67ab24-82bc-4e71-b974-e25b819986c8","Type":"ContainerDied","Data":"c172e0b4868c308f20f7ae8b13ba955f59eebc66ffba5fd517b3648866cbe26f"} Feb 23 14:20:03.036343 master-0 kubenswrapper[7728]: I0223 
14:20:03.036257 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_29f7b30e-bf6a-4e54-b009-1b0fcd830035/installer/0.log" Feb 23 14:20:03.036343 master-0 kubenswrapper[7728]: I0223 14:20:03.036335 7728 generic.go:334] "Generic (PLEG): container finished" podID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerID="325ae25a8338b6a2543759476e50b822896d1071332fcb78a23d45a461fab54f" exitCode=1 Feb 23 14:20:03.037027 master-0 kubenswrapper[7728]: I0223 14:20:03.036436 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"29f7b30e-bf6a-4e54-b009-1b0fcd830035","Type":"ContainerDied","Data":"325ae25a8338b6a2543759476e50b822896d1071332fcb78a23d45a461fab54f"} Feb 23 14:20:03.487289 master-0 kubenswrapper[7728]: I0223 14:20:03.487020 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:03.487289 master-0 kubenswrapper[7728]: I0223 14:20:03.487085 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:04.044438 master-0 kubenswrapper[7728]: I0223 14:20:04.044249 7728 generic.go:334] "Generic (PLEG): container finished" podID="56c3cb71c9851003c8de7e7c5db4b87e" containerID="1161af5c0919fc04c557fffb0fa1799b448226d91a3bed741eb027099a2bf8f9" exitCode=1 Feb 23 14:20:04.044438 master-0 kubenswrapper[7728]: I0223 14:20:04.044382 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"56c3cb71c9851003c8de7e7c5db4b87e","Type":"ContainerDied","Data":"1161af5c0919fc04c557fffb0fa1799b448226d91a3bed741eb027099a2bf8f9"} Feb 23 14:20:04.045720 master-0 kubenswrapper[7728]: I0223 14:20:04.045307 7728 scope.go:117] "RemoveContainer" containerID="1161af5c0919fc04c557fffb0fa1799b448226d91a3bed741eb027099a2bf8f9" Feb 23 14:20:06.363926 master-0 kubenswrapper[7728]: I0223 14:20:06.363782 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:20:06.486681 master-0 kubenswrapper[7728]: I0223 14:20:06.486605 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:06.486908 master-0 kubenswrapper[7728]: I0223 14:20:06.486703 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:07.218419 master-0 kubenswrapper[7728]: I0223 14:20:07.218345 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:20:07.437924 master-0 kubenswrapper[7728]: I0223 14:20:07.437883 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:20:07.513538 master-0 kubenswrapper[7728]: I0223 14:20:07.513044 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Feb 23 14:20:07.518674 master-0 kubenswrapper[7728]: I0223 14:20:07.518629 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_29f7b30e-bf6a-4e54-b009-1b0fcd830035/installer/0.log" Feb 23 14:20:07.518776 master-0 kubenswrapper[7728]: I0223 14:20:07.518727 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:20:07.636627 master-0 kubenswrapper[7728]: I0223 14:20:07.636535 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kube-api-access\") pod \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " Feb 23 14:20:07.637237 master-0 kubenswrapper[7728]: I0223 14:20:07.637156 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-var-lock\") pod \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " Feb 23 14:20:07.637344 master-0 kubenswrapper[7728]: I0223 14:20:07.637279 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-kubelet-dir\") pod \"5f67ab24-82bc-4e71-b974-e25b819986c8\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " Feb 23 14:20:07.637449 master-0 kubenswrapper[7728]: I0223 14:20:07.637331 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-var-lock" (OuterVolumeSpecName: "var-lock") pod "29f7b30e-bf6a-4e54-b009-1b0fcd830035" (UID: "29f7b30e-bf6a-4e54-b009-1b0fcd830035"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:20:07.637449 master-0 kubenswrapper[7728]: I0223 14:20:07.637382 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f67ab24-82bc-4e71-b974-e25b819986c8-kube-api-access\") pod \"5f67ab24-82bc-4e71-b974-e25b819986c8\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " Feb 23 14:20:07.637564 master-0 kubenswrapper[7728]: I0223 14:20:07.637521 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29f7b30e-bf6a-4e54-b009-1b0fcd830035" (UID: "29f7b30e-bf6a-4e54-b009-1b0fcd830035"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:20:07.637564 master-0 kubenswrapper[7728]: I0223 14:20:07.637469 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5f67ab24-82bc-4e71-b974-e25b819986c8" (UID: "5f67ab24-82bc-4e71-b974-e25b819986c8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:20:07.637696 master-0 kubenswrapper[7728]: I0223 14:20:07.637433 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kubelet-dir\") pod \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\" (UID: \"29f7b30e-bf6a-4e54-b009-1b0fcd830035\") " Feb 23 14:20:07.637778 master-0 kubenswrapper[7728]: I0223 14:20:07.637753 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-var-lock\") pod \"5f67ab24-82bc-4e71-b974-e25b819986c8\" (UID: \"5f67ab24-82bc-4e71-b974-e25b819986c8\") " Feb 23 14:20:07.638259 master-0 kubenswrapper[7728]: I0223 14:20:07.638224 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-var-lock" (OuterVolumeSpecName: "var-lock") pod "5f67ab24-82bc-4e71-b974-e25b819986c8" (UID: "5f67ab24-82bc-4e71-b974-e25b819986c8"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:20:07.639236 master-0 kubenswrapper[7728]: I0223 14:20:07.639196 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:07.639298 master-0 kubenswrapper[7728]: I0223 14:20:07.639235 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:07.639298 master-0 kubenswrapper[7728]: I0223 14:20:07.639266 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:07.639298 master-0 kubenswrapper[7728]: I0223 14:20:07.639286 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5f67ab24-82bc-4e71-b974-e25b819986c8-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:07.640837 master-0 kubenswrapper[7728]: I0223 14:20:07.640779 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29f7b30e-bf6a-4e54-b009-1b0fcd830035" (UID: "29f7b30e-bf6a-4e54-b009-1b0fcd830035"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:20:07.642949 master-0 kubenswrapper[7728]: I0223 14:20:07.642890 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f67ab24-82bc-4e71-b974-e25b819986c8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5f67ab24-82bc-4e71-b974-e25b819986c8" (UID: "5f67ab24-82bc-4e71-b974-e25b819986c8"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:20:07.739963 master-0 kubenswrapper[7728]: I0223 14:20:07.739764 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f7b30e-bf6a-4e54-b009-1b0fcd830035-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:07.739963 master-0 kubenswrapper[7728]: I0223 14:20:07.739856 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f67ab24-82bc-4e71-b974-e25b819986c8-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:08.070028 master-0 kubenswrapper[7728]: I0223 14:20:08.069810 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_29f7b30e-bf6a-4e54-b009-1b0fcd830035/installer/0.log" Feb 23 14:20:08.070251 master-0 kubenswrapper[7728]: I0223 14:20:08.070075 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:20:08.070532 master-0 kubenswrapper[7728]: I0223 14:20:08.070420 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"29f7b30e-bf6a-4e54-b009-1b0fcd830035","Type":"ContainerDied","Data":"d4622d20df32d4655ff4c5d8c0ab82bdd9c1a367900ef291744093a4801a66c4"} Feb 23 14:20:08.070532 master-0 kubenswrapper[7728]: I0223 14:20:08.070504 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4622d20df32d4655ff4c5d8c0ab82bdd9c1a367900ef291744093a4801a66c4" Feb 23 14:20:08.072544 master-0 kubenswrapper[7728]: I0223 14:20:08.072454 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"5f67ab24-82bc-4e71-b974-e25b819986c8","Type":"ContainerDied","Data":"6dfbc560ad1e1a5e7a72cca845fe403480de4a21ae08827666c9a7f55f0e049e"} Feb 23 14:20:08.072675 master-0 kubenswrapper[7728]: I0223 14:20:08.072547 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dfbc560ad1e1a5e7a72cca845fe403480de4a21ae08827666c9a7f55f0e049e" Feb 23 14:20:08.072843 master-0 kubenswrapper[7728]: I0223 14:20:08.072808 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Feb 23 14:20:08.181084 master-0 kubenswrapper[7728]: W0223 14:20:08.181043 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18a83278819db2092fa26d8274eb3f00.slice/crio-94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb WatchSource:0}: Error finding container 94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb: Status 404 returned error can't find the container with id 94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb Feb 23 14:20:08.695328 master-0 kubenswrapper[7728]: E0223 14:20:08.695274 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:09.081325 master-0 kubenswrapper[7728]: I0223 14:20:09.081283 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"56c3cb71c9851003c8de7e7c5db4b87e","Type":"ContainerStarted","Data":"901941f5b39d593d08535a59f0a3320fa3d1d31c538434d8bc740dd1aca5de85"} Feb 23 14:20:09.083448 master-0 kubenswrapper[7728]: I0223 14:20:09.083409 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c"} Feb 23 14:20:09.085258 master-0 kubenswrapper[7728]: I0223 14:20:09.085224 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerStarted","Data":"d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0"} Feb 23 14:20:09.086679 master-0 
kubenswrapper[7728]: I0223 14:20:09.086638 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"e1b9898b8e99e752199648be3eeb21746009166b99c8416be13f36fdd12cbcdd"} Feb 23 14:20:09.086795 master-0 kubenswrapper[7728]: I0223 14:20:09.086692 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb"} Feb 23 14:20:09.088404 master-0 kubenswrapper[7728]: I0223 14:20:09.088371 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxh6f" event={"ID":"2a0bf4c4-8272-4f24-8e48-525d7a278b26","Type":"ContainerStarted","Data":"962d19618a794630debc80be44aa12464f3df0a838c893cad0f8f7c0495b4423"} Feb 23 14:20:09.088642 master-0 kubenswrapper[7728]: I0223 14:20:09.088558 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xxh6f" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerName="extract-content" containerID="cri-o://962d19618a794630debc80be44aa12464f3df0a838c893cad0f8f7c0495b4423" gracePeriod=2 Feb 23 14:20:09.091666 master-0 kubenswrapper[7728]: I0223 14:20:09.091629 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtvwp" event={"ID":"3a398c0f-1b6a-4836-a8b4-33b004350d84","Type":"ContainerStarted","Data":"99efcfca15e0062c8ae1454f293dca23babf93693b0446a429661734820c3937"} Feb 23 14:20:09.097563 master-0 kubenswrapper[7728]: I0223 14:20:09.097535 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/0.log" Feb 23 14:20:09.098148 master-0 kubenswrapper[7728]: I0223 14:20:09.098101 
7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71"} Feb 23 14:20:09.098448 master-0 kubenswrapper[7728]: I0223 14:20:09.098384 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:20:09.100065 master-0 kubenswrapper[7728]: I0223 14:20:09.100037 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" event={"ID":"ceba7b56-f910-473d-aed5-add94868fb31","Type":"ContainerStarted","Data":"5f3a2dd4bf392d1f8fcfc7780b16f411d497e94edaa80b6729dddf180e05f5b1"} Feb 23 14:20:09.104513 master-0 kubenswrapper[7728]: I0223 14:20:09.102159 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n82gm" event={"ID":"43ce2f82-05aa-4778-a444-848a408cf570","Type":"ContainerStarted","Data":"aee8f69f97ca5ab0b4e2570f9519b21cf1cd68e3c1a1e58e99c69e8e582ff273"} Feb 23 14:20:09.403345 master-0 kubenswrapper[7728]: E0223 14:20:09.403229 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:10.110069 master-0 kubenswrapper[7728]: I0223 14:20:10.110017 7728 generic.go:334] "Generic (PLEG): container finished" podID="43ce2f82-05aa-4778-a444-848a408cf570" containerID="aee8f69f97ca5ab0b4e2570f9519b21cf1cd68e3c1a1e58e99c69e8e582ff273" exitCode=0 Feb 23 14:20:10.110853 master-0 kubenswrapper[7728]: I0223 14:20:10.110093 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n82gm" 
event={"ID":"43ce2f82-05aa-4778-a444-848a408cf570","Type":"ContainerDied","Data":"aee8f69f97ca5ab0b4e2570f9519b21cf1cd68e3c1a1e58e99c69e8e582ff273"} Feb 23 14:20:10.111822 master-0 kubenswrapper[7728]: I0223 14:20:10.111775 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxh6f_2a0bf4c4-8272-4f24-8e48-525d7a278b26/extract-content/0.log" Feb 23 14:20:10.112741 master-0 kubenswrapper[7728]: I0223 14:20:10.112690 7728 generic.go:334] "Generic (PLEG): container finished" podID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerID="962d19618a794630debc80be44aa12464f3df0a838c893cad0f8f7c0495b4423" exitCode=2 Feb 23 14:20:10.112860 master-0 kubenswrapper[7728]: I0223 14:20:10.112813 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxh6f" event={"ID":"2a0bf4c4-8272-4f24-8e48-525d7a278b26","Type":"ContainerDied","Data":"962d19618a794630debc80be44aa12464f3df0a838c893cad0f8f7c0495b4423"} Feb 23 14:20:10.115238 master-0 kubenswrapper[7728]: I0223 14:20:10.115174 7728 generic.go:334] "Generic (PLEG): container finished" podID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerID="99efcfca15e0062c8ae1454f293dca23babf93693b0446a429661734820c3937" exitCode=0 Feb 23 14:20:10.115238 master-0 kubenswrapper[7728]: I0223 14:20:10.115213 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtvwp" event={"ID":"3a398c0f-1b6a-4836-a8b4-33b004350d84","Type":"ContainerDied","Data":"99efcfca15e0062c8ae1454f293dca23babf93693b0446a429661734820c3937"} Feb 23 14:20:10.122904 master-0 kubenswrapper[7728]: I0223 14:20:10.118908 7728 generic.go:334] "Generic (PLEG): container finished" podID="b993917a-bce8-4467-a09d-bfc923a90460" containerID="d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0" exitCode=0 Feb 23 14:20:10.122904 master-0 kubenswrapper[7728]: I0223 14:20:10.119094 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerDied","Data":"d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0"} Feb 23 14:20:10.122904 master-0 kubenswrapper[7728]: I0223 14:20:10.121661 7728 generic.go:334] "Generic (PLEG): container finished" podID="18a83278819db2092fa26d8274eb3f00" containerID="e1b9898b8e99e752199648be3eeb21746009166b99c8416be13f36fdd12cbcdd" exitCode=0 Feb 23 14:20:10.122904 master-0 kubenswrapper[7728]: I0223 14:20:10.122915 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerDied","Data":"e1b9898b8e99e752199648be3eeb21746009166b99c8416be13f36fdd12cbcdd"} Feb 23 14:20:10.539595 master-0 kubenswrapper[7728]: I0223 14:20:10.539549 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:20:10.611028 master-0 kubenswrapper[7728]: I0223 14:20:10.610953 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxh6f_2a0bf4c4-8272-4f24-8e48-525d7a278b26/extract-content/0.log" Feb 23 14:20:10.611793 master-0 kubenswrapper[7728]: I0223 14:20:10.611753 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:20:10.680710 master-0 kubenswrapper[7728]: I0223 14:20:10.680642 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-utilities\") pod \"43ce2f82-05aa-4778-a444-848a408cf570\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " Feb 23 14:20:10.682677 master-0 kubenswrapper[7728]: I0223 14:20:10.680705 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-catalog-content\") pod \"43ce2f82-05aa-4778-a444-848a408cf570\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " Feb 23 14:20:10.682800 master-0 kubenswrapper[7728]: I0223 14:20:10.681452 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-utilities" (OuterVolumeSpecName: "utilities") pod "43ce2f82-05aa-4778-a444-848a408cf570" (UID: "43ce2f82-05aa-4778-a444-848a408cf570"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:20:10.682800 master-0 kubenswrapper[7728]: I0223 14:20:10.682778 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-utilities\") pod \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " Feb 23 14:20:10.683877 master-0 kubenswrapper[7728]: I0223 14:20:10.683809 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-utilities" (OuterVolumeSpecName: "utilities") pod "2a0bf4c4-8272-4f24-8e48-525d7a278b26" (UID: "2a0bf4c4-8272-4f24-8e48-525d7a278b26"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:20:10.684029 master-0 kubenswrapper[7728]: I0223 14:20:10.683901 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggcqk\" (UniqueName: \"kubernetes.io/projected/43ce2f82-05aa-4778-a444-848a408cf570-kube-api-access-ggcqk\") pod \"43ce2f82-05aa-4778-a444-848a408cf570\" (UID: \"43ce2f82-05aa-4778-a444-848a408cf570\") " Feb 23 14:20:10.684428 master-0 kubenswrapper[7728]: I0223 14:20:10.684374 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59xbh\" (UniqueName: \"kubernetes.io/projected/2a0bf4c4-8272-4f24-8e48-525d7a278b26-kube-api-access-59xbh\") pod \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " Feb 23 14:20:10.684428 master-0 kubenswrapper[7728]: I0223 14:20:10.684427 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-catalog-content\") pod \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\" (UID: \"2a0bf4c4-8272-4f24-8e48-525d7a278b26\") " Feb 23 14:20:10.684766 master-0 kubenswrapper[7728]: I0223 14:20:10.684712 7728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-utilities\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:10.684766 master-0 kubenswrapper[7728]: I0223 14:20:10.684744 7728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-utilities\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:10.688291 master-0 kubenswrapper[7728]: I0223 14:20:10.688206 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ce2f82-05aa-4778-a444-848a408cf570-kube-api-access-ggcqk" (OuterVolumeSpecName: 
"kube-api-access-ggcqk") pod "43ce2f82-05aa-4778-a444-848a408cf570" (UID: "43ce2f82-05aa-4778-a444-848a408cf570"). InnerVolumeSpecName "kube-api-access-ggcqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:20:10.688452 master-0 kubenswrapper[7728]: I0223 14:20:10.688419 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0bf4c4-8272-4f24-8e48-525d7a278b26-kube-api-access-59xbh" (OuterVolumeSpecName: "kube-api-access-59xbh") pod "2a0bf4c4-8272-4f24-8e48-525d7a278b26" (UID: "2a0bf4c4-8272-4f24-8e48-525d7a278b26"). InnerVolumeSpecName "kube-api-access-59xbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:20:10.708009 master-0 kubenswrapper[7728]: I0223 14:20:10.707945 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43ce2f82-05aa-4778-a444-848a408cf570" (UID: "43ce2f82-05aa-4778-a444-848a408cf570"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:20:10.723593 master-0 kubenswrapper[7728]: I0223 14:20:10.723535 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a0bf4c4-8272-4f24-8e48-525d7a278b26" (UID: "2a0bf4c4-8272-4f24-8e48-525d7a278b26"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:20:10.785180 master-0 kubenswrapper[7728]: I0223 14:20:10.785083 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59xbh\" (UniqueName: \"kubernetes.io/projected/2a0bf4c4-8272-4f24-8e48-525d7a278b26-kube-api-access-59xbh\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:10.785180 master-0 kubenswrapper[7728]: I0223 14:20:10.785125 7728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0bf4c4-8272-4f24-8e48-525d7a278b26-catalog-content\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:10.785180 master-0 kubenswrapper[7728]: I0223 14:20:10.785134 7728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43ce2f82-05aa-4778-a444-848a408cf570-catalog-content\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:10.785180 master-0 kubenswrapper[7728]: I0223 14:20:10.785144 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggcqk\" (UniqueName: \"kubernetes.io/projected/43ce2f82-05aa-4778-a444-848a408cf570-kube-api-access-ggcqk\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:11.131878 master-0 kubenswrapper[7728]: I0223 14:20:11.131739 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerStarted","Data":"e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095"} Feb 23 14:20:11.133993 master-0 kubenswrapper[7728]: I0223 14:20:11.133955 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n82gm" event={"ID":"43ce2f82-05aa-4778-a444-848a408cf570","Type":"ContainerDied","Data":"d034ee90da289ad6b98a9bab827fdbf238a1e22975fb6217c4d11b94c5b9f815"} Feb 23 14:20:11.133993 master-0 kubenswrapper[7728]: I0223 14:20:11.133989 7728 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n82gm" Feb 23 14:20:11.134161 master-0 kubenswrapper[7728]: I0223 14:20:11.134015 7728 scope.go:117] "RemoveContainer" containerID="aee8f69f97ca5ab0b4e2570f9519b21cf1cd68e3c1a1e58e99c69e8e582ff273" Feb 23 14:20:11.136380 master-0 kubenswrapper[7728]: I0223 14:20:11.136352 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xxh6f_2a0bf4c4-8272-4f24-8e48-525d7a278b26/extract-content/0.log" Feb 23 14:20:11.136837 master-0 kubenswrapper[7728]: I0223 14:20:11.136810 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xxh6f" Feb 23 14:20:11.137067 master-0 kubenswrapper[7728]: I0223 14:20:11.136964 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xxh6f" event={"ID":"2a0bf4c4-8272-4f24-8e48-525d7a278b26","Type":"ContainerDied","Data":"023a865fc1618efadf669d9c23556b255086ea013fc0ed3bfdf62e3156c24070"} Feb 23 14:20:11.138387 master-0 kubenswrapper[7728]: I0223 14:20:11.138354 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtvwp" event={"ID":"3a398c0f-1b6a-4836-a8b4-33b004350d84","Type":"ContainerStarted","Data":"05b715f2ab02ad5806af23b9ce5ee7e81a28b5cc350517ef49d7396788fb5efc"} Feb 23 14:20:11.153107 master-0 kubenswrapper[7728]: I0223 14:20:11.153069 7728 scope.go:117] "RemoveContainer" containerID="f5924ea863c0ed9e2a9af26deac1c7c0b867ae7a40e98d453b8a9fd857b9c3cf" Feb 23 14:20:11.173133 master-0 kubenswrapper[7728]: I0223 14:20:11.173093 7728 scope.go:117] "RemoveContainer" containerID="962d19618a794630debc80be44aa12464f3df0a838c893cad0f8f7c0495b4423" Feb 23 14:20:11.188585 master-0 kubenswrapper[7728]: I0223 14:20:11.188545 7728 scope.go:117] "RemoveContainer" containerID="65283b394e3f27d5698322b04176e2aa8708a2e0dcc6d84e940b16ca132e47aa" Feb 
23 14:20:12.486261 master-0 kubenswrapper[7728]: I0223 14:20:12.486148 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:12.487058 master-0 kubenswrapper[7728]: I0223 14:20:12.486259 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:12.487058 master-0 kubenswrapper[7728]: I0223 14:20:12.486177 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:12.487058 master-0 kubenswrapper[7728]: I0223 14:20:12.486371 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:13.020979 master-0 kubenswrapper[7728]: I0223 14:20:13.020924 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:20:13.021216 master-0 kubenswrapper[7728]: I0223 14:20:13.021002 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:20:13.080792 master-0 kubenswrapper[7728]: I0223 14:20:13.080707 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:20:14.166180 master-0 kubenswrapper[7728]: I0223 14:20:14.166109 7728 generic.go:334] "Generic (PLEG): container finished" podID="12dab5d350ebc129b0bfa4714d330b15" containerID="86dd361ededa7f9d61d9c2bea900261b661a76c0468603804e9af20765f8d8cd" exitCode=0 Feb 23 14:20:15.486784 master-0 kubenswrapper[7728]: I0223 14:20:15.486662 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:15.487812 master-0 kubenswrapper[7728]: I0223 14:20:15.486777 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:15.487812 master-0 kubenswrapper[7728]: I0223 14:20:15.486947 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:15.487812 master-0 kubenswrapper[7728]: I0223 14:20:15.486994 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:15.897944 master-0 kubenswrapper[7728]: I0223 14:20:15.897776 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:20:15.897944 master-0 kubenswrapper[7728]: I0223 14:20:15.897846 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:20:16.363643 master-0 kubenswrapper[7728]: I0223 14:20:16.363536 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:20:16.964813 master-0 kubenswrapper[7728]: I0223 14:20:16.964748 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mtvwp" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="registry-server" probeResult="failure" output=< Feb 23 14:20:16.964813 master-0 kubenswrapper[7728]: timeout: failed to connect service ":50051" within 1s Feb 23 14:20:16.964813 master-0 kubenswrapper[7728]: > Feb 23 14:20:17.188084 master-0 kubenswrapper[7728]: I0223 14:20:17.188012 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_12dab5d350ebc129b0bfa4714d330b15/etcdctl/0.log" Feb 23 14:20:17.188290 master-0 kubenswrapper[7728]: I0223 14:20:17.188087 7728 generic.go:334] "Generic (PLEG): container finished" podID="12dab5d350ebc129b0bfa4714d330b15" containerID="5809ecf60a8e4db68dfab073298af03c567dcc4e91a5b6d7f6d78ca758010d15" exitCode=137 Feb 23 14:20:17.188290 master-0 kubenswrapper[7728]: I0223 14:20:17.188139 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd18675422a4846ac8ff692dbc3019546e4c2cecfa8b4d0fe07976539e44abe0" Feb 23 14:20:17.192681 master-0 
kubenswrapper[7728]: I0223 14:20:17.192632 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_12dab5d350ebc129b0bfa4714d330b15/etcdctl/0.log" Feb 23 14:20:17.192789 master-0 kubenswrapper[7728]: I0223 14:20:17.192751 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:20:17.231636 master-0 kubenswrapper[7728]: I0223 14:20:17.231520 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:20:17.385515 master-0 kubenswrapper[7728]: I0223 14:20:17.385402 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") pod \"12dab5d350ebc129b0bfa4714d330b15\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " Feb 23 14:20:17.385828 master-0 kubenswrapper[7728]: I0223 14:20:17.385571 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir" (OuterVolumeSpecName: "data-dir") pod "12dab5d350ebc129b0bfa4714d330b15" (UID: "12dab5d350ebc129b0bfa4714d330b15"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:20:17.385828 master-0 kubenswrapper[7728]: I0223 14:20:17.385646 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") pod \"12dab5d350ebc129b0bfa4714d330b15\" (UID: \"12dab5d350ebc129b0bfa4714d330b15\") " Feb 23 14:20:17.385828 master-0 kubenswrapper[7728]: I0223 14:20:17.385746 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs" (OuterVolumeSpecName: "certs") pod "12dab5d350ebc129b0bfa4714d330b15" (UID: "12dab5d350ebc129b0bfa4714d330b15"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:20:17.386024 master-0 kubenswrapper[7728]: I0223 14:20:17.385988 7728 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:17.386024 master-0 kubenswrapper[7728]: I0223 14:20:17.386011 7728 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/12dab5d350ebc129b0bfa4714d330b15-data-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:20:18.194523 master-0 kubenswrapper[7728]: I0223 14:20:18.194387 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:20:18.486367 master-0 kubenswrapper[7728]: I0223 14:20:18.486146 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:18.486367 master-0 kubenswrapper[7728]: I0223 14:20:18.486251 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:18.486367 master-0 kubenswrapper[7728]: I0223 14:20:18.486321 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:18.486812 master-0 kubenswrapper[7728]: I0223 14:20:18.486394 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:18.697454 master-0 kubenswrapper[7728]: E0223 14:20:18.697322 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Feb 23 14:20:19.364317 master-0 kubenswrapper[7728]: I0223 14:20:19.364233 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:19.403932 master-0 kubenswrapper[7728]: E0223 14:20:19.403863 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:21.013214 master-0 kubenswrapper[7728]: E0223 14:20:21.012964 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.1896e5ff0bc02bc0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:12dab5d350ebc129b0bfa4714d330b15,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:46.993433536 +0000 UTC m=+79.956094832,LastTimestamp:2026-02-23 14:19:46.993433536 +0000 UTC m=+79.956094832,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:20:21.486601 master-0 kubenswrapper[7728]: I0223 14:20:21.486436 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: 
Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:21.487035 master-0 kubenswrapper[7728]: I0223 14:20:21.486602 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:22.096807 master-0 kubenswrapper[7728]: E0223 14:20:22.096749 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Feb 23 14:20:24.485916 master-0 kubenswrapper[7728]: I0223 14:20:24.485841 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:24.486834 master-0 kubenswrapper[7728]: I0223 14:20:24.485921 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:25.257101 master-0 kubenswrapper[7728]: I0223 14:20:25.257026 7728 generic.go:334] "Generic (PLEG): container finished" podID="cf04aca0-8174-4134-835d-37adf6a3b5ca" containerID="93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3" exitCode=0 Feb 23 14:20:25.259805 master-0 kubenswrapper[7728]: I0223 14:20:25.259762 7728 
generic.go:334] "Generic (PLEG): container finished" podID="865ceedb-b19a-4f2f-b295-311e1b7a645e" containerID="515b3836a32aed4579312ac49c6468a1e7035624b7a30950b8364d5d10c9310d" exitCode=0 Feb 23 14:20:27.486407 master-0 kubenswrapper[7728]: I0223 14:20:27.486309 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:27.486407 master-0 kubenswrapper[7728]: I0223 14:20:27.486398 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:28.698007 master-0 kubenswrapper[7728]: E0223 14:20:28.697872 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:29.298734 master-0 kubenswrapper[7728]: I0223 14:20:29.298453 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0fdb9885-7479-43b5-8613-b2857a798ade/installer/0.log" Feb 23 14:20:29.298734 master-0 kubenswrapper[7728]: I0223 14:20:29.298589 7728 generic.go:334] "Generic (PLEG): container finished" podID="0fdb9885-7479-43b5-8613-b2857a798ade" containerID="0a7994e86e7ddf474fa9a6e9d028e17c8d71e5299119418e1b05d25a7b604984" exitCode=1 Feb 23 14:20:29.364605 master-0 kubenswrapper[7728]: I0223 14:20:29.364441 7728 prober.go:107] "Probe failed" probeType="Startup" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:29.405113 master-0 kubenswrapper[7728]: E0223 14:20:29.405003 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:30.308135 master-0 kubenswrapper[7728]: I0223 14:20:30.308000 7728 generic.go:334] "Generic (PLEG): container finished" podID="24829faf-50e8-45bb-abb0-7cc5ccf81080" containerID="0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83" exitCode=0 Feb 23 14:20:30.310504 master-0 kubenswrapper[7728]: I0223 14:20:30.310442 7728 generic.go:334] "Generic (PLEG): container finished" podID="961e4ecd-545b-4270-ae34-e733dec793b6" containerID="5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea" exitCode=0 Feb 23 14:20:30.485873 master-0 kubenswrapper[7728]: I0223 14:20:30.485785 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:30.486085 master-0 kubenswrapper[7728]: I0223 14:20:30.485867 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 
14:20:31.317365 master-0 kubenswrapper[7728]: I0223 14:20:31.317287 7728 generic.go:334] "Generic (PLEG): container finished" podID="674041a2-e2b0-4286-88cc-f1b00571e3f3" containerID="444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae" exitCode=0 Feb 23 14:20:33.486386 master-0 kubenswrapper[7728]: I0223 14:20:33.486262 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:33.487095 master-0 kubenswrapper[7728]: I0223 14:20:33.486395 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:35.235261 master-0 kubenswrapper[7728]: E0223 14:20:35.233712 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Feb 23 14:20:35.344322 master-0 kubenswrapper[7728]: I0223 14:20:35.344232 7728 generic.go:334] "Generic (PLEG): container finished" podID="b714a9df-026e-423d-a980-2569f0d92e47" containerID="5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673" exitCode=0 Feb 23 14:20:36.368340 master-0 kubenswrapper[7728]: I0223 14:20:36.368284 7728 generic.go:334] "Generic (PLEG): container finished" podID="18a83278819db2092fa26d8274eb3f00" containerID="626890ddbc06982ad60de27c4c4ad3f994d6a386f27886fbc0cdba298ce4fc87" exitCode=0 Feb 23 14:20:36.485695 master-0 kubenswrapper[7728]: I0223 14:20:36.485638 7728 patch_prober.go:28] interesting 
pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:36.485920 master-0 kubenswrapper[7728]: I0223 14:20:36.485710 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:38.698787 master-0 kubenswrapper[7728]: E0223 14:20:38.698640 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:38.699807 master-0 kubenswrapper[7728]: I0223 14:20:38.698770 7728 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 23 14:20:39.364230 master-0 kubenswrapper[7728]: I0223 14:20:39.364087 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:39.406384 master-0 kubenswrapper[7728]: E0223 14:20:39.406284 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Feb 23 14:20:39.406384 master-0 kubenswrapper[7728]: E0223 14:20:39.406340 7728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 14:20:39.486443 master-0 kubenswrapper[7728]: I0223 14:20:39.486359 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:39.486790 master-0 kubenswrapper[7728]: I0223 14:20:39.486442 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:40.395538 master-0 kubenswrapper[7728]: I0223 14:20:40.395380 7728 generic.go:334] "Generic (PLEG): container finished" podID="b9cf1c39-24f0-420b-8020-089616d1cdf0" containerID="4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818" exitCode=0 Feb 23 14:20:42.486120 master-0 kubenswrapper[7728]: I0223 14:20:42.486024 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:42.486951 master-0 kubenswrapper[7728]: I0223 14:20:42.486117 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:44.425148 master-0 kubenswrapper[7728]: I0223 14:20:44.424948 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-td489_bbe678de-546d-49d0-8280-3f6d94fa5e4f/approver/0.log" Feb 23 14:20:44.425982 master-0 kubenswrapper[7728]: I0223 14:20:44.425678 7728 generic.go:334] "Generic (PLEG): container finished" podID="bbe678de-546d-49d0-8280-3f6d94fa5e4f" containerID="86a800fe59aed9a0c248de7a352a6c1ffaea2cbdde27bb246147baa866e1c79a" exitCode=1 Feb 23 14:20:45.486023 master-0 kubenswrapper[7728]: I0223 14:20:45.485909 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:45.486023 master-0 kubenswrapper[7728]: I0223 14:20:45.485997 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:48.123511 master-0 kubenswrapper[7728]: I0223 14:20:48.123384 7728 status_manager.go:851] "Failed to get status for pod" podUID="d1bffce5-019a-4c97-85f2-929dc19a0bde" pod="openshift-kube-controller-manager/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Feb 23 14:20:48.486529 master-0 kubenswrapper[7728]: I0223 14:20:48.486413 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:48.486529 master-0 kubenswrapper[7728]: I0223 14:20:48.486524 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:48.701625 master-0 kubenswrapper[7728]: E0223 14:20:48.701505 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Feb 23 14:20:51.234508 master-0 kubenswrapper[7728]: E0223 14:20:51.234418 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:20:51.235599 master-0 kubenswrapper[7728]: E0223 14:20:51.235037 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.015s" Feb 23 14:20:51.235599 master-0 kubenswrapper[7728]: I0223 14:20:51.235433 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerDied","Data":"93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3"} Feb 23 14:20:51.237304 master-0 kubenswrapper[7728]: I0223 14:20:51.237247 7728 scope.go:117] "RemoveContainer" 
containerID="93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3" Feb 23 14:20:51.238526 master-0 kubenswrapper[7728]: I0223 14:20:51.238101 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:20:51.238526 master-0 kubenswrapper[7728]: I0223 14:20:51.238437 7728 scope.go:117] "RemoveContainer" containerID="86a800fe59aed9a0c248de7a352a6c1ffaea2cbdde27bb246147baa866e1c79a" Feb 23 14:20:51.239293 master-0 kubenswrapper[7728]: I0223 14:20:51.238992 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 23 14:20:51.239293 master-0 kubenswrapper[7728]: I0223 14:20:51.239123 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" containerID="cri-o://12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c" gracePeriod=30 Feb 23 14:20:51.245387 master-0 kubenswrapper[7728]: I0223 14:20:51.245333 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12dab5d350ebc129b0bfa4714d330b15" path="/var/lib/kubelet/pods/12dab5d350ebc129b0bfa4714d330b15/volumes" Feb 23 14:20:51.245805 master-0 kubenswrapper[7728]: I0223 14:20:51.245768 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:20:51.472149 master-0 kubenswrapper[7728]: I0223 14:20:51.472098 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-td489_bbe678de-546d-49d0-8280-3f6d94fa5e4f/approver/0.log" 
Feb 23 14:20:51.476331 master-0 kubenswrapper[7728]: I0223 14:20:51.476286 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c" exitCode=2 Feb 23 14:20:51.486144 master-0 kubenswrapper[7728]: I0223 14:20:51.486048 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:51.486144 master-0 kubenswrapper[7728]: I0223 14:20:51.486111 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:52.499103 master-0 kubenswrapper[7728]: I0223 14:20:52.499028 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-td489_bbe678de-546d-49d0-8280-3f6d94fa5e4f/approver/0.log" Feb 23 14:20:54.486064 master-0 kubenswrapper[7728]: I0223 14:20:54.485941 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:54.486064 master-0 kubenswrapper[7728]: I0223 14:20:54.486033 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:55.015750 master-0 kubenswrapper[7728]: E0223 14:20:55.015410 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{machine-approver-7dd9c7d7b9-rn8fj.1896e5ff38c896d2 openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-7dd9c7d7b9-rn8fj,UID:c67a2ed2-f520-46fc-84d3-6816dc19f4e0,APIVersion:v1,ResourceVersion:9572,FieldPath:spec.containers{machine-approver-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2ba8aec9f09d75121b95d2e6f1097415302c0ae7121fa7076fd38d7adb9a5afa\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:47.748959954 +0000 UTC m=+80.711621250,LastTimestamp:2026-02-23 14:19:47.748959954 +0000 UTC m=+80.711621250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:20:58.123810 master-0 kubenswrapper[7728]: I0223 14:20:58.122099 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:20:58.123810 master-0 kubenswrapper[7728]: I0223 14:20:58.122193 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:20:58.902830 master-0 kubenswrapper[7728]: E0223 14:20:58.902746 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Feb 23 14:20:59.595209 master-0 kubenswrapper[7728]: E0223 14:20:59.595059 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:20:49Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:20:49Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:20:49Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:20:49Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:584b5d125dad1fa4f8d03e6ace2e4901c173569ff1ed9536da6915c56fa52bc0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8124eb3839b25af23303e9fdde35728bfd24d7c0c47530e77852cba1dd9d1ffb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1702755272},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:94d88fe2fa42931a725508dbf17296b6ed99b8e20c1169f5d1fb8a36f4927ddd\\\"],\\\"sizeBytes\\\":1637274270},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a8ac0ba2e5115c9d451d553741173ae8744d4544da15e28bf38f61630182fd\\\"],\\\"siz
eBytes\\\":1237794314},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2af15d278f72034eecf3db74223b7e61f3d07c1a5c7ba760e7586915ff1b17e\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e166d1252d7455d8bd62e43f2967e738ee9bdd6a09b7771a4187d82477ae7535\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1237042376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5385b46d054c9ee73478bf23e07056d0b9f81d34619d0949927d8d9e791fcb5\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ebdc10b149ba97b999770285d06149ef92c780205d916c3cab994098e20be0ba\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1210455233},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:518982b9ad8a8bfb7bb3b4216b235cac99e126df3bb48e390b36064560c76b83\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b3293b04e31c8e67c885f77e0ad2ee994295afde7c42cb9761c7090ae0cdb3f8\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1202767548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4775c6461221dafe3ddd67ff683ccb665bed6eb278fa047d9d744aab9af65dcf\\\"],\\\"sizeBytes\\\":992461126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\\\"],\\\"sizeBytes\\\":943734757},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6c7ec917f0eff7b41d7174f1b5fdc4ce53ad106e51599afba731a8431ff9caa7\\\"],\\\"sizeBytes\\\":918153745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ff40a2d97bf7a95e19303f7e972b7e8354a3864039111c6d33d5479117aaeed\\\"],\\\"sizeBytes\\\":880247193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72fafcd55ab739919dd8a114863fda27106af1c497f474e7ce0cb23b58dfa021\\\"],\\\"sizeBytes\\\":8
75998518},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7b9239f1f5e9590e3db71e61fde86db8f43e0085f61ae7769508d2ea058481c7\\\"],\\\"sizeBytes\\\":862501144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:572b0ca6e993beea2ee9346197665e56a2e4999fbb6958c747c48a35bf72ee34\\\"],\\\"sizeBytes\\\":862091954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3fa84eaa1310d97fe55bb23a7c27ece85718d0643fa7fc0ff81014edb4b948b\\\"],\\\"sizeBytes\\\":772838975},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bd420e879c9f0271bca2d123a6d762591d9a4626b72f254d1f885842c32149e8\\\"],\\\"sizeBytes\\\":687849728},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c467c1eeba7434b2aebf07169ab8afe0203d638e871dbdf29a16f830e9aef9e\\\"],\\\"sizeBytes\\\":682963466},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5121a0944000b7bfa57ae2e4eb3f412e1b4b89fcc75eec1ef20241182c0527f2\\\"],\\\"sizeBytes\\\":677827184},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a31b448302fbb994548ed801ac488a44e8a7c4ae9149c3b4cc20d6af832f83\\\"],\\\"sizeBytes\\\":621542709},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e089c4e4fa9a22803b2673b776215e021a1f12a856dbcaba2fadee29bee10a3\\\"],\\\"sizeBytes\\\":589275174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1582ea693f35073e3316e2380a18227b78096ca7f4e1328f1dd8a2c423da26e9\\\"],\\\"sizeBytes\\\":582052489},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74\\\"],\\\"sizeBytes\\\":558105176},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:69f9df2f6b5cd83ab895e9e4a9bf8920d35fe450679ce06fb223944e95cfbe3e\\\"],\\\"sizeBytes\\\":557320737},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f86073cf0561e
4b69668f8917ef5184cb0ef5aa16d0fefe38118f1167b268721\\\"],\\\"sizeBytes\\\":548646306},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\\\"],\\\"sizeBytes\\\":529218694},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bc0ca626e5e17f9f78ddbfde54ea13ddc7749904911817bba16e6b59f30499ec\\\"],\\\"sizeBytes\\\":528829499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396\\\"],\\\"sizeBytes\\\":518279996},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:40bb7cf7c637bf9efd8fb0157839d325a019d67cc7d7279665fcf90dbb7f3f33\\\"],\\\"sizeBytes\\\":517888569},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\\\"],\\\"sizeBytes\\\":514875199},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a1b426a276216372c7d688fe60e9eaf251efd35071f94e1bcd4337f51a90fd75\\\"],\\\"sizeBytes\\\":513473308},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce471c00b59fd855a59f7efa9afdb3f0f9cbf1c4bcce3a82fe1a4cb82e90f52e\\\"],\\\"sizeBytes\\\":513119434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a9dcbc6b966928b7597d4a822948ae6f07b62feecb91679c1d825d0d19426e19\\\"],\\\"sizeBytes\\\":512172666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5f4a546983224e416dfcc3a700afc15f9790182a5a2f8f7c94892d0e95abab3\\\"],\\\"sizeBytes\\\":511125422},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8de5c5b21ed8c7829ba988d580ffa470c9913877fe0ee5e11bf507400ffbc7\\\"],\\\"sizeBytes\\\":511059399},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:64ba461fd5594e3a30bfd755f1496707a88249bc68d07c65124c8617d664d2ac\\\"],\\\"sizeBytes\\\":508786786},{\\\"names\\\":[
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a82e441a9e9b93f0e010f1ce26e30c24b6ca93f7752084d4694ebdb3c5b53f83\\\"],\\\"sizeBytes\\\":508443359},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896\\\"],\\\"sizeBytes\\\":507867630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e\\\"],\\\"sizeBytes\\\":506374680},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8618d42fe4da4881abe39e98691d187e13713981b66d0dac0a11cb1287482b7\\\"],\\\"sizeBytes\\\":506291135},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce68078d909b63bb5b872d94c04829aa1b5812c416abbaf9024840d348ee68b1\\\"],\\\"sizeBytes\\\":505244089},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\\\"],\\\"sizeBytes\\\":505137106},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebf883de8fd905490f0c9b420a5d6446ecde18e12e15364f6dcd4e885104972c\\\"],\\\"sizeBytes\\\":504558291},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:897708222502e4d710dd737923f74d153c084ba6048bffceb16dfd30f79a6ecc\\\"],\\\"sizeBytes\\\":504513960},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86d9e1fdf97794f44fc1c91da025714ec6900fafa6cdc4c0041ffa95e9d70c6c\\\"],\\\"sizeBytes\\\":495888162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b\\\"],\\\"sizeBytes\\\":494959854},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2feb43a969b7dbc113ee1a572cbf6f9f922655\\\"],\\\"sizeBytes\\\":486990304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4797a485fd4ab3414ba8d52bdf2afccefab6c65
7b1d259baad703fca5145124c\\\"],\\\"sizeBytes\\\":484349508},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a132d09565133b36ac7c797213d6a74ac810bb368ef59136320ab3d300f45bd\\\"],\\\"sizeBytes\\\":484074784},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6acc7c3c018d8bb3cb597580eedae0300c44a5424f07129270c878899ef592a6\\\"],\\\"sizeBytes\\\":470717179},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:235b846666adaa2e4b4d6d0f7fd71d57bf3be253466e1d9fffafd103fa2696ac\\\"],\\\"sizeBytes\\\":470575802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce89154fa3fe1e87c660e644b58cf125fede575869fd5841600082c0d1f858a3\\\"],\\\"sizeBytes\\\":468159025}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:00.486596 master-0 kubenswrapper[7728]: I0223 14:21:00.486520 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:00.486841 master-0 kubenswrapper[7728]: I0223 14:21:00.486617 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:03.163202 master-0 kubenswrapper[7728]: I0223 14:21:03.163133 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_493a9ed3-6d64-489a-a68c-235b69a58782/installer/0.log" Feb 23 14:21:03.163202 master-0 kubenswrapper[7728]: I0223 14:21:03.163198 7728 generic.go:334] "Generic (PLEG): container finished" podID="493a9ed3-6d64-489a-a68c-235b69a58782" containerID="1df16973da8e7c98a51b37b7335c255585ebd5dc4bbbed0d842fe3c32df42186" exitCode=1 Feb 23 14:21:03.486345 master-0 kubenswrapper[7728]: I0223 14:21:03.486270 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:03.486657 master-0 kubenswrapper[7728]: I0223 14:21:03.486370 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:06.487279 master-0 kubenswrapper[7728]: I0223 14:21:06.487148 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:06.488184 master-0 kubenswrapper[7728]: I0223 14:21:06.487270 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:09.304300 master-0 kubenswrapper[7728]: E0223 14:21:09.304193 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Feb 23 14:21:09.596235 master-0 kubenswrapper[7728]: E0223 14:21:09.596062 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Feb 23 14:21:10.486172 master-0 kubenswrapper[7728]: I0223 14:21:10.486081 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:21:10.487043 master-0 kubenswrapper[7728]: I0223 14:21:10.486184 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:13.229361 master-0 kubenswrapper[7728]: I0223 14:21:13.229270 7728 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/0.log" Feb 23 14:21:13.230315 master-0 kubenswrapper[7728]: I0223 14:21:13.229367 7728 generic.go:334] "Generic (PLEG): container finished" podID="3488a7eb-5170-478c-9af7-490dbe0f514e" containerID="475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d" exitCode=1 Feb 23 14:21:13.486192 master-0 kubenswrapper[7728]: I0223 14:21:13.485984 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:21:13.486192 master-0 kubenswrapper[7728]: I0223 14:21:13.486119 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:16.486706 master-0 kubenswrapper[7728]: I0223 14:21:16.486602 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:21:16.487371 master-0 kubenswrapper[7728]: I0223 14:21:16.486729 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" 
podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:19.485281 master-0 kubenswrapper[7728]: I0223 14:21:19.485194 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:21:19.485281 master-0 kubenswrapper[7728]: I0223 14:21:19.485273 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:19.596601 master-0 kubenswrapper[7728]: E0223 14:21:19.596511 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:20.105663 master-0 kubenswrapper[7728]: E0223 14:21:20.105560 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Feb 23 14:21:22.486413 master-0 kubenswrapper[7728]: I0223 14:21:22.486307 7728 patch_prober.go:28] 
interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:21:22.487613 master-0 kubenswrapper[7728]: I0223 14:21:22.486413 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:25.248677 master-0 kubenswrapper[7728]: E0223 14:21:25.248610 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:21:25.249192 master-0 kubenswrapper[7728]: E0223 14:21:25.248871 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.011s" Feb 23 14:21:25.249192 master-0 kubenswrapper[7728]: I0223 14:21:25.248905 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:21:25.249192 master-0 kubenswrapper[7728]: I0223 14:21:25.248941 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerDied","Data":"515b3836a32aed4579312ac49c6468a1e7035624b7a30950b8364d5d10c9310d"} Feb 23 14:21:25.249192 master-0 kubenswrapper[7728]: I0223 14:21:25.249172 7728 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:21:25.249986 master-0 kubenswrapper[7728]: I0223 14:21:25.249930 7728 scope.go:117] "RemoveContainer" containerID="515b3836a32aed4579312ac49c6468a1e7035624b7a30950b8364d5d10c9310d" Feb 23 14:21:25.250717 master-0 kubenswrapper[7728]: I0223 14:21:25.250273 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71"} pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Feb 23 14:21:25.250717 master-0 kubenswrapper[7728]: I0223 14:21:25.250367 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" containerID="cri-o://502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71" gracePeriod=30 Feb 23 14:21:25.261502 master-0 kubenswrapper[7728]: I0223 14:21:25.261428 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:21:25.265661 master-0 kubenswrapper[7728]: I0223 14:21:25.265603 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": read tcp 10.128.0.2:49658->10.128.0.52:8443: read: connection reset by peer" start-of-body= Feb 23 14:21:25.265756 master-0 kubenswrapper[7728]: I0223 14:21:25.265677 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" 
podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": read tcp 10.128.0.2:49658->10.128.0.52:8443: read: connection reset by peer" Feb 23 14:21:25.266688 master-0 kubenswrapper[7728]: I0223 14:21:25.266361 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:25.267042 master-0 kubenswrapper[7728]: I0223 14:21:25.266425 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:26.321109 master-0 kubenswrapper[7728]: I0223 14:21:26.321027 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/1.log" Feb 23 14:21:26.322073 master-0 kubenswrapper[7728]: I0223 14:21:26.321726 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/0.log" Feb 23 14:21:26.322602 master-0 kubenswrapper[7728]: I0223 14:21:26.322533 7728 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71" exitCode=255 Feb 23 14:21:27.486017 master-0 kubenswrapper[7728]: I0223 14:21:27.485961 7728 patch_prober.go:28] interesting 
pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:27.487062 master-0 kubenswrapper[7728]: I0223 14:21:27.486801 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:29.017995 master-0 kubenswrapper[7728]: E0223 14:21:29.017716 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{machine-approver-7dd9c7d7b9-rn8fj.1896e5ff4916a992 openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-7dd9c7d7b9-rn8fj,UID:c67a2ed2-f520-46fc-84d3-6816dc19f4e0,APIVersion:v1,ResourceVersion:9572,FieldPath:spec.containers{machine-approver-controller},},Reason:Created,Message:Created container: machine-approver-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:48.022512018 +0000 UTC m=+80.985173314,LastTimestamp:2026-02-23 14:19:48.022512018 +0000 UTC m=+80.985173314,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:21:29.597748 master-0 kubenswrapper[7728]: E0223 14:21:29.597629 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:30.486004 master-0 kubenswrapper[7728]: I0223 14:21:30.485899 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:30.486872 master-0 kubenswrapper[7728]: I0223 14:21:30.486007 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:31.708099 master-0 kubenswrapper[7728]: E0223 14:21:31.707766 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Feb 23 14:21:33.486170 master-0 kubenswrapper[7728]: I0223 14:21:33.486040 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:33.486170 master-0 kubenswrapper[7728]: I0223 14:21:33.486132 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:36.485684 master-0 kubenswrapper[7728]: I0223 14:21:36.485580 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:36.485684 master-0 kubenswrapper[7728]: I0223 14:21:36.485669 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:39.486274 master-0 kubenswrapper[7728]: I0223 14:21:39.486131 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:39.486274 master-0 kubenswrapper[7728]: I0223 14:21:39.486244 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:39.598101 master-0 kubenswrapper[7728]: E0223 14:21:39.597997 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:21:39.598101 master-0 kubenswrapper[7728]: E0223 14:21:39.598064 7728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 14:21:42.486341 master-0 kubenswrapper[7728]: I0223 14:21:42.486235 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:42.486341 master-0 kubenswrapper[7728]: I0223 14:21:42.486321 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:44.909690 master-0 kubenswrapper[7728]: E0223 14:21:44.909513 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Feb 23 14:21:45.485794 master-0 kubenswrapper[7728]: I0223 14:21:45.485728 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:45.486036 master-0 kubenswrapper[7728]: I0223 14:21:45.485796 7728 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:46.457360 master-0 kubenswrapper[7728]: I0223 14:21:46.457174 7728 generic.go:334] "Generic (PLEG): container finished" podID="585f74db-4593-426b-b0c7-ec8f64810549" containerID="3d191963e287b24eb8e359eae476b7710f1b01ed3998cce17300434d7f6e8d0b" exitCode=0 Feb 23 14:21:46.459891 master-0 kubenswrapper[7728]: I0223 14:21:46.459844 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-9cc7d7bb-6zmk9_1c60ff3f-2bb1-422e-be27-5eca96d85fd2/manager/0.log" Feb 23 14:21:46.459891 master-0 kubenswrapper[7728]: I0223 14:21:46.459887 7728 generic.go:334] "Generic (PLEG): container finished" podID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerID="0bda8d15a11221e7b98f49af56e0807945868c4a5e5d028da4a5c53d7f410c01" exitCode=1 Feb 23 14:21:47.469516 master-0 kubenswrapper[7728]: I0223 14:21:47.469409 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-84b8d9d697-2hr5s_66c72c71-f74a-43ab-bf0d-1f4c93623774/manager/0.log" Feb 23 14:21:47.470339 master-0 kubenswrapper[7728]: I0223 14:21:47.470260 7728 generic.go:334] "Generic (PLEG): container finished" podID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerID="e192093c7698f9c13f14fd55a50b3b960cd4142b3b8cb914299c2709465ffc51" exitCode=1 Feb 23 14:21:48.124968 master-0 kubenswrapper[7728]: I0223 14:21:48.124869 7728 status_manager.go:851] "Failed to get status for pod" podUID="c67a2ed2-f520-46fc-84d3-6816dc19f4e0" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" err="the server was unable to return a response in the time allotted, but may 
still be processing the request (get pods machine-approver-7dd9c7d7b9-rn8fj)" Feb 23 14:21:48.486207 master-0 kubenswrapper[7728]: I0223 14:21:48.486118 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:48.487104 master-0 kubenswrapper[7728]: I0223 14:21:48.486213 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:50.265792 master-0 kubenswrapper[7728]: I0223 14:21:50.265685 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body= Feb 23 14:21:50.267990 master-0 kubenswrapper[7728]: I0223 14:21:50.265787 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:21:50.267990 master-0 kubenswrapper[7728]: I0223 14:21:50.265832 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: 
connection refused" start-of-body= Feb 23 14:21:50.267990 master-0 kubenswrapper[7728]: I0223 14:21:50.265930 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:21:51.486149 master-0 kubenswrapper[7728]: I0223 14:21:51.486058 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:51.486149 master-0 kubenswrapper[7728]: I0223 14:21:51.486133 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:52.508079 master-0 kubenswrapper[7728]: I0223 14:21:52.507962 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000" exitCode=1 Feb 23 14:21:54.486424 master-0 kubenswrapper[7728]: I0223 14:21:54.486326 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:54.487195 master-0 kubenswrapper[7728]: I0223 14:21:54.486414 7728 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:55.369537 master-0 kubenswrapper[7728]: I0223 14:21:55.369430 7728 patch_prober.go:28] interesting pod/operator-controller-controller-manager-9cc7d7bb-6zmk9 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Feb 23 14:21:55.369841 master-0 kubenswrapper[7728]: I0223 14:21:55.369590 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" podUID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Feb 23 14:21:55.383243 master-0 kubenswrapper[7728]: I0223 14:21:55.383151 7728 patch_prober.go:28] interesting pod/catalogd-controller-manager-84b8d9d697-2hr5s container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Feb 23 14:21:55.383400 master-0 kubenswrapper[7728]: I0223 14:21:55.383281 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" podUID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Feb 23 14:21:56.543784 master-0 kubenswrapper[7728]: I0223 14:21:56.543699 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/0.log" Feb 23 14:21:56.544602 master-0 kubenswrapper[7728]: I0223 14:21:56.543798 7728 generic.go:334] "Generic (PLEG): container finished" podID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17" containerID="b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69" exitCode=1 Feb 23 14:21:57.486173 master-0 kubenswrapper[7728]: I0223 14:21:57.486064 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:21:57.486579 master-0 kubenswrapper[7728]: I0223 14:21:57.486186 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:21:57.553793 master-0 kubenswrapper[7728]: I0223 14:21:57.553702 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/config-sync-controllers/0.log" Feb 23 14:21:57.554701 master-0 kubenswrapper[7728]: I0223 14:21:57.554586 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/cluster-cloud-controller-manager/0.log" Feb 23 14:21:57.554701 master-0 kubenswrapper[7728]: I0223 14:21:57.554656 7728 generic.go:334] "Generic (PLEG): container finished" 
podID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerID="06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb" exitCode=1 Feb 23 14:21:57.554701 master-0 kubenswrapper[7728]: I0223 14:21:57.554686 7728 generic.go:334] "Generic (PLEG): container finished" podID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerID="89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8" exitCode=1 Feb 23 14:21:59.264663 master-0 kubenswrapper[7728]: E0223 14:21:59.264562 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:21:59.265566 master-0 kubenswrapper[7728]: E0223 14:21:59.264748 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.016s" Feb 23 14:21:59.271846 master-0 kubenswrapper[7728]: I0223 14:21:59.271798 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:21:59.716261 master-0 kubenswrapper[7728]: E0223 14:21:59.716034 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:21:49Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:21:49Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:21:49Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:21:49Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:584b5d125dad1fa4f8d03e6ace2e4901c173569ff1ed9536da6915c56fa52bc0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8124eb3839b25af23303e9fdde35728bfd24d7c0c47530e77852cba1dd9d1ffb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1702755272},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:94d88fe2fa42931a725508dbf17296b6ed99b8e20c1169f5d1fb8a36f4927ddd\\\"],\\\"sizeBytes\\\":1637274270},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a8ac0ba2e5115c9d451d553741173ae8744d4544da15e28bf38f61630182fd\\\"],\\\"sizeBytes\\\":1237794314},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2af15d278f72034eecf3db74223b7e61f3d07c1a5c7ba760e7586915ff1b17e\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e166d1252d7455d8bd62e43f2967e738ee9bdd6a09b7771a4187d82477ae7535\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1237042376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5385b46d054c9ee73478bf23e07056d0b9f81d34619d0949927d8d9e791fcb5\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ebdc10b149ba97b999770285d06149ef92c780205d916c3cab994098e20be0ba\\\",\\\"registry.redhat.io/redhat/community-opera
tor-index:v4.18\\\"],\\\"sizeBytes\\\":1210455233},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:518982b9ad8a8bfb7bb3b4216b235cac99e126df3bb48e390b36064560c76b83\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b3293b04e31c8e67c885f77e0ad2ee994295afde7c42cb9761c7090ae0cdb3f8\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1202767548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4775c6461221dafe3ddd67ff683ccb665bed6eb278fa047d9d744aab9af65dcf\\\"],\\\"sizeBytes\\\":992461126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\\\"],\\\"sizeBytes\\\":943734757},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6c7ec917f0eff7b41d7174f1b5fdc4ce53ad106e51599afba731a8431ff9caa7\\\"],\\\"sizeBytes\\\":918153745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ff40a2d97bf7a95e19303f7e972b7e8354a3864039111c6d33d5479117aaeed\\\"],\\\"sizeBytes\\\":880247193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72fafcd55ab739919dd8a114863fda27106af1c497f474e7ce0cb23b58dfa021\\\"],\\\"sizeBytes\\\":875998518},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7b9239f1f5e9590e3db71e61fde86db8f43e0085f61ae7769508d2ea058481c7\\\"],\\\"sizeBytes\\\":862501144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:572b0ca6e993beea2ee9346197665e56a2e4999fbb6958c747c48a35bf72ee34\\\"],\\\"sizeBytes\\\":862091954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3fa84eaa1310d97fe55bb23a7c27ece85718d0643fa7fc0ff81014edb4b948b\\\"],\\\"sizeBytes\\\":772838975},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bd420e879c9f0271bca2d123a6d762591d9a4626b72f254d1f885842c32149e8\\\"],\\\"sizeBytes\\\":687849728},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c467c1eeba7434b2aebf07169ab8afe0203d638e871dbdf29a16f830e9aef9e\\\"],\\\"sizeBytes\\\":682963466},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5121a0944000b7bfa57ae2e4eb3f412e1b4b89fcc75eec1ef20241182c0527f2\\\"],\\\"sizeBytes\\\":677827184},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a31b448302fbb994548ed801ac488a44e8a7c4ae9149c3b4cc20d6af832f83\\\"],\\\"sizeBytes\\\":621542709},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e089c4e4fa9a22803b2673b776215e021a1f12a856dbcaba2fadee29bee10a3\\\"],\\\"sizeBytes\\\":589275174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1582ea693f35073e3316e2380a18227b78096ca7f4e1328f1dd8a2c423da26e9\\\"],\\\"sizeBytes\\\":582052489},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74\\\"],\\\"sizeBytes\\\":558105176},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:69f9df2f6b5cd83ab895e9e4a9bf8920d35fe450679ce06fb223944e95cfbe3e\\\"],\\\"sizeBytes\\\":557320737},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f86073cf0561e4b69668f8917ef5184cb0ef5aa16d0fefe38118f1167b268721\\\"],\\\"sizeBytes\\\":548646306},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\\\"],\\\"sizeBytes\\\":529218694},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bc0ca626e5e17f9f78ddbfde54ea13ddc7749904911817bba16e6b59f30499ec\\\"],\\\"sizeBytes\\\":528829499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396\\\"],\\\"sizeBytes\\\":518279996},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:40bb7cf7c637bf9efd8fb0157839d325a019d67cc7d727
9665fcf90dbb7f3f33\\\"],\\\"sizeBytes\\\":517888569},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\\\"],\\\"sizeBytes\\\":514875199},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a1b426a276216372c7d688fe60e9eaf251efd35071f94e1bcd4337f51a90fd75\\\"],\\\"sizeBytes\\\":513473308},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce471c00b59fd855a59f7efa9afdb3f0f9cbf1c4bcce3a82fe1a4cb82e90f52e\\\"],\\\"sizeBytes\\\":513119434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a9dcbc6b966928b7597d4a822948ae6f07b62feecb91679c1d825d0d19426e19\\\"],\\\"sizeBytes\\\":512172666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5f4a546983224e416dfcc3a700afc15f9790182a5a2f8f7c94892d0e95abab3\\\"],\\\"sizeBytes\\\":511125422},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8de5c5b21ed8c7829ba988d580ffa470c9913877fe0ee5e11bf507400ffbc7\\\"],\\\"sizeBytes\\\":511059399},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:64ba461fd5594e3a30bfd755f1496707a88249bc68d07c65124c8617d664d2ac\\\"],\\\"sizeBytes\\\":508786786},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a82e441a9e9b93f0e010f1ce26e30c24b6ca93f7752084d4694ebdb3c5b53f83\\\"],\\\"sizeBytes\\\":508443359},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896\\\"],\\\"sizeBytes\\\":507867630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e\\\"],\\\"sizeBytes\\\":506374680},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8618d42fe4da4881abe39e98691d187e13713981b66d0dac0a11cb1287482b7\\\"],\\\"sizeBytes\\\":506291135},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ce68078d909b63bb5b872d94c04829aa1b5812c416abbaf9024840d348ee68b1\\\"],\\\"sizeBytes\\\":505244089},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\\\"],\\\"sizeBytes\\\":505137106},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebf883de8fd905490f0c9b420a5d6446ecde18e12e15364f6dcd4e885104972c\\\"],\\\"sizeBytes\\\":504558291},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:897708222502e4d710dd737923f74d153c084ba6048bffceb16dfd30f79a6ecc\\\"],\\\"sizeBytes\\\":504513960},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86d9e1fdf97794f44fc1c91da025714ec6900fafa6cdc4c0041ffa95e9d70c6c\\\"],\\\"sizeBytes\\\":495888162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b\\\"],\\\"sizeBytes\\\":494959854},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2feb43a969b7dbc113ee1a572cbf6f9f922655\\\"],\\\"sizeBytes\\\":486990304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4797a485fd4ab3414ba8d52bdf2afccefab6c657b1d259baad703fca5145124c\\\"],\\\"sizeBytes\\\":484349508},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a132d09565133b36ac7c797213d6a74ac810bb368ef59136320ab3d300f45bd\\\"],\\\"sizeBytes\\\":484074784},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6acc7c3c018d8bb3cb597580eedae0300c44a5424f07129270c878899ef592a6\\\"],\\\"sizeBytes\\\":470717179},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:235b846666adaa2e4b4d6d0f7fd71d57bf3be253466e1d9fffafd103fa2696ac\\\"],\\\"sizeBytes\\\":470575802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce89154fa3fe1e87c660e644b58cf125fede575869fd5841600082c0d1f858a3\\\
"],\\\"sizeBytes\\\":468159025}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:00.266923 master-0 kubenswrapper[7728]: I0223 14:22:00.265946 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body= Feb 23 14:22:00.266923 master-0 kubenswrapper[7728]: I0223 14:22:00.266043 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body= Feb 23 14:22:00.266923 master-0 kubenswrapper[7728]: I0223 14:22:00.266088 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:22:00.266923 master-0 kubenswrapper[7728]: I0223 14:22:00.266138 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:22:00.486609 master-0 kubenswrapper[7728]: I0223 14:22:00.486462 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:00.486891 master-0 kubenswrapper[7728]: I0223 14:22:00.486618 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:01.310854 master-0 kubenswrapper[7728]: E0223 14:22:01.310658 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:22:03.022257 master-0 kubenswrapper[7728]: E0223 14:22:03.021905 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{machine-approver-7dd9c7d7b9-rn8fj.1896e5ff546a37fe openshift-cluster-machine-approver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-machine-approver,Name:machine-approver-7dd9c7d7b9-rn8fj,UID:c67a2ed2-f520-46fc-84d3-6816dc19f4e0,APIVersion:v1,ResourceVersion:9572,FieldPath:spec.containers{machine-approver-controller},},Reason:Started,Message:Started container machine-approver-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:48.212537342 +0000 UTC m=+81.175198648,LastTimestamp:2026-02-23 14:19:48.212537342 +0000 UTC m=+81.175198648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:22:03.486277 master-0 kubenswrapper[7728]: I0223 14:22:03.486177 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:03.486545 master-0 kubenswrapper[7728]: I0223 14:22:03.486272 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:05.369279 master-0 kubenswrapper[7728]: I0223 14:22:05.369198 7728 patch_prober.go:28] interesting pod/operator-controller-controller-manager-9cc7d7bb-6zmk9 container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Feb 23 14:22:05.370116 master-0 kubenswrapper[7728]: I0223 14:22:05.369284 7728 patch_prober.go:28] interesting 
pod/operator-controller-controller-manager-9cc7d7bb-6zmk9 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Feb 23 14:22:05.370116 master-0 kubenswrapper[7728]: I0223 14:22:05.369291 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" podUID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" Feb 23 14:22:05.370116 master-0 kubenswrapper[7728]: I0223 14:22:05.369372 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" podUID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Feb 23 14:22:05.384073 master-0 kubenswrapper[7728]: I0223 14:22:05.383956 7728 patch_prober.go:28] interesting pod/catalogd-controller-manager-84b8d9d697-2hr5s container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Feb 23 14:22:05.384073 master-0 kubenswrapper[7728]: I0223 14:22:05.383989 7728 patch_prober.go:28] interesting pod/catalogd-controller-manager-84b8d9d697-2hr5s container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Feb 23 14:22:05.384073 master-0 kubenswrapper[7728]: I0223 14:22:05.384035 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" podUID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Feb 23 14:22:05.384073 master-0 kubenswrapper[7728]: I0223 14:22:05.384050 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" podUID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 10.128.0.35:8081: connect: connection refused" Feb 23 14:22:05.993213 master-0 kubenswrapper[7728]: I0223 14:22:05.993124 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 14:22:05.993720 master-0 kubenswrapper[7728]: I0223 14:22:05.993231 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 14:22:06.486944 master-0 kubenswrapper[7728]: I0223 14:22:06.486800 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:06.487798 master-0 kubenswrapper[7728]: I0223 14:22:06.486962 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:09.486984 master-0 kubenswrapper[7728]: I0223 14:22:09.486888 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:09.487866 master-0 kubenswrapper[7728]: I0223 14:22:09.487019 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:09.716610 master-0 kubenswrapper[7728]: E0223 14:22:09.716545 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:10.265171 master-0 kubenswrapper[7728]: I0223 14:22:10.265109 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body= Feb 23 14:22:10.265171 master-0 kubenswrapper[7728]: I0223 14:22:10.265149 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator 
namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body= Feb 23 14:22:10.265433 master-0 kubenswrapper[7728]: I0223 14:22:10.265186 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:22:10.265433 master-0 kubenswrapper[7728]: I0223 14:22:10.265217 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:22:10.648354 master-0 kubenswrapper[7728]: I0223 14:22:10.648293 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-686847ff5f-9q266_4373687a-61a0-434b-81f7-3fecaa1494ef/control-plane-machine-set-operator/0.log" Feb 23 14:22:10.648890 master-0 kubenswrapper[7728]: I0223 14:22:10.648353 7728 generic.go:334] "Generic (PLEG): container finished" podID="4373687a-61a0-434b-81f7-3fecaa1494ef" containerID="9b45bf126e1d92621372b72946a5700b9c49834f8698b4a6266b185922dfcbee" exitCode=1 Feb 23 14:22:12.486408 master-0 kubenswrapper[7728]: I0223 14:22:12.486297 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:12.486408 master-0 kubenswrapper[7728]: 
I0223 14:22:12.486393 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:15.369869 master-0 kubenswrapper[7728]: I0223 14:22:15.369766 7728 patch_prober.go:28] interesting pod/operator-controller-controller-manager-9cc7d7bb-6zmk9 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Feb 23 14:22:15.369869 master-0 kubenswrapper[7728]: I0223 14:22:15.369858 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" podUID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Feb 23 14:22:15.383754 master-0 kubenswrapper[7728]: I0223 14:22:15.383665 7728 patch_prober.go:28] interesting pod/catalogd-controller-manager-84b8d9d697-2hr5s container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Feb 23 14:22:15.383754 master-0 kubenswrapper[7728]: I0223 14:22:15.383729 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" podUID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Feb 23 14:22:15.486753 master-0 kubenswrapper[7728]: I0223 14:22:15.486647 
7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:15.486753 master-0 kubenswrapper[7728]: I0223 14:22:15.486740 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:18.312925 master-0 kubenswrapper[7728]: E0223 14:22:18.312768 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:22:18.486621 master-0 kubenswrapper[7728]: I0223 14:22:18.486449 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:18.486858 master-0 kubenswrapper[7728]: I0223 14:22:18.486637 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:19.717177 master-0 kubenswrapper[7728]: E0223 14:22:19.716800 7728 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:20.265952 master-0 kubenswrapper[7728]: I0223 14:22:20.265857 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" start-of-body= Feb 23 14:22:20.266190 master-0 kubenswrapper[7728]: I0223 14:22:20.265947 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:22:21.486618 master-0 kubenswrapper[7728]: I0223 14:22:21.486551 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:21.487654 master-0 kubenswrapper[7728]: I0223 14:22:21.487590 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:22.728441 master-0 kubenswrapper[7728]: I0223 14:22:22.728327 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/1.log" Feb 23 14:22:22.729345 master-0 kubenswrapper[7728]: I0223 14:22:22.729279 7728 generic.go:334] "Generic (PLEG): container finished" podID="cf04aca0-8174-4134-835d-37adf6a3b5ca" containerID="1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2" exitCode=255 Feb 23 14:22:24.486619 master-0 kubenswrapper[7728]: I0223 14:22:24.486537 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:24.487209 master-0 kubenswrapper[7728]: I0223 14:22:24.486623 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:25.369978 master-0 kubenswrapper[7728]: I0223 14:22:25.369868 7728 patch_prober.go:28] interesting pod/operator-controller-controller-manager-9cc7d7bb-6zmk9 container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Feb 23 14:22:25.369978 master-0 kubenswrapper[7728]: I0223 14:22:25.369922 7728 patch_prober.go:28] interesting pod/operator-controller-controller-manager-9cc7d7bb-6zmk9 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: 
connection refused" start-of-body= Feb 23 14:22:25.370372 master-0 kubenswrapper[7728]: I0223 14:22:25.369958 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" podUID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" Feb 23 14:22:25.370372 master-0 kubenswrapper[7728]: I0223 14:22:25.370023 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" podUID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Feb 23 14:22:25.383701 master-0 kubenswrapper[7728]: I0223 14:22:25.383622 7728 patch_prober.go:28] interesting pod/catalogd-controller-manager-84b8d9d697-2hr5s container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Feb 23 14:22:25.383701 master-0 kubenswrapper[7728]: I0223 14:22:25.383676 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" podUID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 10.128.0.35:8081: connect: connection refused" Feb 23 14:22:25.384242 master-0 kubenswrapper[7728]: I0223 14:22:25.384162 7728 patch_prober.go:28] interesting pod/catalogd-controller-manager-84b8d9d697-2hr5s container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Feb 23 
14:22:25.384351 master-0 kubenswrapper[7728]: I0223 14:22:25.384290 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" podUID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Feb 23 14:22:28.485317 master-0 kubenswrapper[7728]: I0223 14:22:28.485263 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:28.486154 master-0 kubenswrapper[7728]: I0223 14:22:28.485328 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:29.717451 master-0 kubenswrapper[7728]: E0223 14:22:29.717362 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:30.265413 master-0 kubenswrapper[7728]: I0223 14:22:30.265323 7728 patch_prober.go:28] interesting pod/marketplace-operator-6f5488b997-7b5sp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" 
start-of-body= Feb 23 14:22:30.265413 master-0 kubenswrapper[7728]: I0223 14:22:30.265394 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" podUID="585f74db-4593-426b-b0c7-ec8f64810549" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.15:8080/healthz\": dial tcp 10.128.0.15:8080: connect: connection refused" Feb 23 14:22:30.782759 master-0 kubenswrapper[7728]: I0223 14:22:30.782627 7728 generic.go:334] "Generic (PLEG): container finished" podID="b090ed5a-984f-41dd-8cea-34a1ece1514f" containerID="be11245e52df36836387b793176a5296c3112993cdce052d05331b901d833321" exitCode=0 Feb 23 14:22:31.485799 master-0 kubenswrapper[7728]: I0223 14:22:31.485676 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:31.486146 master-0 kubenswrapper[7728]: I0223 14:22:31.485803 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:31.792278 master-0 kubenswrapper[7728]: I0223 14:22:31.792073 7728 generic.go:334] "Generic (PLEG): container finished" podID="959c2393-e914-4c10-a18f-b30fcf012d19" containerID="943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904" exitCode=0 Feb 23 14:22:31.795328 master-0 kubenswrapper[7728]: I0223 14:22:31.795264 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/0.log" Feb 23 14:22:31.795328 master-0 kubenswrapper[7728]: I0223 14:22:31.795323 7728 generic.go:334] "Generic (PLEG): container finished" podID="12b256b7-a57b-4124-8452-25e74cfa7926" containerID="59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9" exitCode=1 Feb 23 14:22:32.967653 master-0 kubenswrapper[7728]: I0223 14:22:32.967569 7728 patch_prober.go:28] interesting pod/controller-manager-55d786cb4c-cqkbt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" start-of-body= Feb 23 14:22:32.967653 master-0 kubenswrapper[7728]: I0223 14:22:32.967641 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" Feb 23 14:22:32.968253 master-0 kubenswrapper[7728]: I0223 14:22:32.967842 7728 patch_prober.go:28] interesting pod/controller-manager-55d786cb4c-cqkbt container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" start-of-body= Feb 23 14:22:32.968253 master-0 kubenswrapper[7728]: I0223 14:22:32.967916 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.58:8443/healthz\": dial tcp 10.128.0.58:8443: connect: connection refused" Feb 23 
14:22:33.275374 master-0 kubenswrapper[7728]: E0223 14:22:33.275205 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:22:33.275704 master-0 kubenswrapper[7728]: E0223 14:22:33.275439 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.011s" Feb 23 14:22:33.275704 master-0 kubenswrapper[7728]: I0223 14:22:33.275676 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:22:33.277738 master-0 kubenswrapper[7728]: I0223 14:22:33.277682 7728 scope.go:117] "RemoveContainer" containerID="1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2" Feb 23 14:22:33.278077 master-0 kubenswrapper[7728]: I0223 14:22:33.278036 7728 scope.go:117] "RemoveContainer" containerID="3d191963e287b24eb8e359eae476b7710f1b01ed3998cce17300434d7f6e8d0b" Feb 23 14:22:33.278697 master-0 kubenswrapper[7728]: I0223 14:22:33.278662 7728 scope.go:117] "RemoveContainer" containerID="b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69" Feb 23 14:22:33.278825 master-0 kubenswrapper[7728]: I0223 14:22:33.278771 7728 scope.go:117] "RemoveContainer" containerID="5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea" Feb 23 14:22:33.280562 master-0 kubenswrapper[7728]: I0223 14:22:33.280284 7728 scope.go:117] "RemoveContainer" containerID="0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83" Feb 23 14:22:33.280562 master-0 kubenswrapper[7728]: I0223 14:22:33.280454 7728 scope.go:117] "RemoveContainer" containerID="444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae" Feb 23 14:22:33.281502 master-0 kubenswrapper[7728]: I0223 14:22:33.281353 7728 scope.go:117] "RemoveContainer" 
containerID="59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9" Feb 23 14:22:33.281960 master-0 kubenswrapper[7728]: I0223 14:22:33.281867 7728 scope.go:117] "RemoveContainer" containerID="9b45bf126e1d92621372b72946a5700b9c49834f8698b4a6266b185922dfcbee" Feb 23 14:22:33.283142 master-0 kubenswrapper[7728]: I0223 14:22:33.282895 7728 scope.go:117] "RemoveContainer" containerID="943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904" Feb 23 14:22:33.284653 master-0 kubenswrapper[7728]: I0223 14:22:33.284401 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:22:33.285223 master-0 kubenswrapper[7728]: I0223 14:22:33.285150 7728 scope.go:117] "RemoveContainer" containerID="475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d" Feb 23 14:22:33.285784 master-0 kubenswrapper[7728]: I0223 14:22:33.285674 7728 scope.go:117] "RemoveContainer" containerID="5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673" Feb 23 14:22:33.286309 master-0 kubenswrapper[7728]: I0223 14:22:33.286091 7728 scope.go:117] "RemoveContainer" containerID="a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000" Feb 23 14:22:33.288873 master-0 kubenswrapper[7728]: I0223 14:22:33.288823 7728 scope.go:117] "RemoveContainer" containerID="4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818" Feb 23 14:22:33.289891 master-0 kubenswrapper[7728]: I0223 14:22:33.289856 7728 scope.go:117] "RemoveContainer" containerID="0bda8d15a11221e7b98f49af56e0807945868c4a5e5d028da4a5c53d7f410c01" Feb 23 14:22:33.290364 master-0 kubenswrapper[7728]: I0223 14:22:33.290132 7728 scope.go:117] "RemoveContainer" containerID="e192093c7698f9c13f14fd55a50b3b960cd4142b3b8cb914299c2709465ffc51" Feb 23 14:22:33.290692 master-0 kubenswrapper[7728]: I0223 14:22:33.290652 7728 scope.go:117] "RemoveContainer" containerID="89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8" Feb 23 
14:22:33.290692 master-0 kubenswrapper[7728]: I0223 14:22:33.290687 7728 scope.go:117] "RemoveContainer" containerID="06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb" Feb 23 14:22:33.291141 master-0 kubenswrapper[7728]: I0223 14:22:33.291094 7728 scope.go:117] "RemoveContainer" containerID="be11245e52df36836387b793176a5296c3112993cdce052d05331b901d833321" Feb 23 14:22:33.816029 master-0 kubenswrapper[7728]: I0223 14:22:33.815621 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/0.log" Feb 23 14:22:33.822153 master-0 kubenswrapper[7728]: I0223 14:22:33.819705 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/1.log" Feb 23 14:22:33.833502 master-0 kubenswrapper[7728]: I0223 14:22:33.829659 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-686847ff5f-9q266_4373687a-61a0-434b-81f7-3fecaa1494ef/control-plane-machine-set-operator/0.log" Feb 23 14:22:33.837953 master-0 kubenswrapper[7728]: I0223 14:22:33.833820 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/0.log" Feb 23 14:22:33.837953 master-0 kubenswrapper[7728]: I0223 14:22:33.836406 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/0.log" Feb 23 14:22:34.172518 master-0 kubenswrapper[7728]: I0223 14:22:34.172489 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_493a9ed3-6d64-489a-a68c-235b69a58782/installer/0.log" Feb 23 14:22:34.172975 master-0 kubenswrapper[7728]: I0223 14:22:34.172545 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:22:34.179582 master-0 kubenswrapper[7728]: I0223 14:22:34.179548 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0fdb9885-7479-43b5-8613-b2857a798ade/installer/0.log" Feb 23 14:22:34.179680 master-0 kubenswrapper[7728]: I0223 14:22:34.179600 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:22:34.282652 master-0 kubenswrapper[7728]: I0223 14:22:34.282568 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fdb9885-7479-43b5-8613-b2857a798ade-kube-api-access\") pod \"0fdb9885-7479-43b5-8613-b2857a798ade\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " Feb 23 14:22:34.282797 master-0 kubenswrapper[7728]: I0223 14:22:34.282660 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-var-lock\") pod \"493a9ed3-6d64-489a-a68c-235b69a58782\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " Feb 23 14:22:34.282797 master-0 kubenswrapper[7728]: I0223 14:22:34.282687 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-kubelet-dir\") pod \"0fdb9885-7479-43b5-8613-b2857a798ade\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " Feb 23 14:22:34.282797 master-0 kubenswrapper[7728]: I0223 14:22:34.282719 7728 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-var-lock\") pod \"0fdb9885-7479-43b5-8613-b2857a798ade\" (UID: \"0fdb9885-7479-43b5-8613-b2857a798ade\") " Feb 23 14:22:34.282797 master-0 kubenswrapper[7728]: I0223 14:22:34.282753 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/493a9ed3-6d64-489a-a68c-235b69a58782-kube-api-access\") pod \"493a9ed3-6d64-489a-a68c-235b69a58782\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " Feb 23 14:22:34.282797 master-0 kubenswrapper[7728]: I0223 14:22:34.282787 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-kubelet-dir\") pod \"493a9ed3-6d64-489a-a68c-235b69a58782\" (UID: \"493a9ed3-6d64-489a-a68c-235b69a58782\") " Feb 23 14:22:34.283129 master-0 kubenswrapper[7728]: I0223 14:22:34.283101 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "493a9ed3-6d64-489a-a68c-235b69a58782" (UID: "493a9ed3-6d64-489a-a68c-235b69a58782"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:22:34.283197 master-0 kubenswrapper[7728]: I0223 14:22:34.283114 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0fdb9885-7479-43b5-8613-b2857a798ade" (UID: "0fdb9885-7479-43b5-8613-b2857a798ade"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:22:34.283197 master-0 kubenswrapper[7728]: I0223 14:22:34.283141 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-var-lock" (OuterVolumeSpecName: "var-lock") pod "0fdb9885-7479-43b5-8613-b2857a798ade" (UID: "0fdb9885-7479-43b5-8613-b2857a798ade"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:22:34.283687 master-0 kubenswrapper[7728]: I0223 14:22:34.283638 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-var-lock" (OuterVolumeSpecName: "var-lock") pod "493a9ed3-6d64-489a-a68c-235b69a58782" (UID: "493a9ed3-6d64-489a-a68c-235b69a58782"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:22:34.286116 master-0 kubenswrapper[7728]: I0223 14:22:34.286063 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fdb9885-7479-43b5-8613-b2857a798ade-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0fdb9885-7479-43b5-8613-b2857a798ade" (UID: "0fdb9885-7479-43b5-8613-b2857a798ade"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:22:34.291108 master-0 kubenswrapper[7728]: I0223 14:22:34.291041 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493a9ed3-6d64-489a-a68c-235b69a58782-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "493a9ed3-6d64-489a-a68c-235b69a58782" (UID: "493a9ed3-6d64-489a-a68c-235b69a58782"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:22:34.384202 master-0 kubenswrapper[7728]: I0223 14:22:34.384083 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fdb9885-7479-43b5-8613-b2857a798ade-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:22:34.384202 master-0 kubenswrapper[7728]: I0223 14:22:34.384151 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:22:34.384202 master-0 kubenswrapper[7728]: I0223 14:22:34.384173 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:22:34.384202 master-0 kubenswrapper[7728]: I0223 14:22:34.384192 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fdb9885-7479-43b5-8613-b2857a798ade-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:22:34.384410 master-0 kubenswrapper[7728]: I0223 14:22:34.384210 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/493a9ed3-6d64-489a-a68c-235b69a58782-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:22:34.384410 master-0 kubenswrapper[7728]: I0223 14:22:34.384228 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493a9ed3-6d64-489a-a68c-235b69a58782-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:22:34.486160 master-0 kubenswrapper[7728]: I0223 14:22:34.486060 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure 
output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:34.486269 master-0 kubenswrapper[7728]: I0223 14:22:34.486161 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:34.860377 master-0 kubenswrapper[7728]: I0223 14:22:34.860291 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/config-sync-controllers/0.log" Feb 23 14:22:34.861008 master-0 kubenswrapper[7728]: I0223 14:22:34.860950 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/cluster-cloud-controller-manager/0.log" Feb 23 14:22:34.863225 master-0 kubenswrapper[7728]: I0223 14:22:34.863168 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0fdb9885-7479-43b5-8613-b2857a798ade/installer/0.log" Feb 23 14:22:34.863394 master-0 kubenswrapper[7728]: I0223 14:22:34.863350 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:22:34.869285 master-0 kubenswrapper[7728]: I0223 14:22:34.869221 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-9cc7d7bb-6zmk9_1c60ff3f-2bb1-422e-be27-5eca96d85fd2/manager/0.log" Feb 23 14:22:34.880602 master-0 kubenswrapper[7728]: I0223 14:22:34.880549 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_493a9ed3-6d64-489a-a68c-235b69a58782/installer/0.log" Feb 23 14:22:34.880967 master-0 kubenswrapper[7728]: I0223 14:22:34.880901 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:22:34.883745 master-0 kubenswrapper[7728]: I0223 14:22:34.883690 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-84b8d9d697-2hr5s_66c72c71-f74a-43ab-bf0d-1f4c93623774/manager/0.log" Feb 23 14:22:35.314795 master-0 kubenswrapper[7728]: E0223 14:22:35.314697 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:22:35.993239 master-0 kubenswrapper[7728]: I0223 14:22:35.993134 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 14:22:35.993589 master-0 kubenswrapper[7728]: I0223 14:22:35.993239 7728 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 14:22:37.024556 master-0 kubenswrapper[7728]: E0223 14:22:37.024368 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e5ff73e4d525 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86d9e1fdf97794f44fc1c91da025714ec6900fafa6cdc4c0041ffa95e9d70c6c\" in 5.586s (5.586s including waiting). 
Image size: 495888162 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:48.740666661 +0000 UTC m=+81.703327957,LastTimestamp:2026-02-23 14:19:48.740666661 +0000 UTC m=+81.703327957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:22:37.487105 master-0 kubenswrapper[7728]: I0223 14:22:37.486944 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:37.487397 master-0 kubenswrapper[7728]: I0223 14:22:37.487108 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:39.717831 master-0 kubenswrapper[7728]: E0223 14:22:39.717717 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:39.717831 master-0 kubenswrapper[7728]: E0223 14:22:39.717764 7728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 14:22:40.487325 master-0 kubenswrapper[7728]: I0223 14:22:40.487122 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:40.487325 master-0 kubenswrapper[7728]: I0223 14:22:40.487306 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:43.487202 master-0 kubenswrapper[7728]: I0223 14:22:43.487081 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:43.487202 master-0 kubenswrapper[7728]: I0223 14:22:43.487195 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:46.295008 master-0 kubenswrapper[7728]: E0223 14:22:46.294894 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Feb 23 14:22:46.486666 master-0 kubenswrapper[7728]: I0223 14:22:46.486599 
7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:46.486776 master-0 kubenswrapper[7728]: I0223 14:22:46.486681 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:46.979254 master-0 kubenswrapper[7728]: I0223 14:22:46.979208 7728 generic.go:334] "Generic (PLEG): container finished" podID="18a83278819db2092fa26d8274eb3f00" containerID="4850e29e1670d0434d8ca87c5950a0424937b61be4c5fb2ae511df8fe764c7a2" exitCode=0 Feb 23 14:22:48.127525 master-0 kubenswrapper[7728]: I0223 14:22:48.127408 7728 status_manager.go:851] "Failed to get status for pod" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" pod="openshift-marketplace/community-operators-xxh6f" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods community-operators-xxh6f)" Feb 23 14:22:49.487078 master-0 kubenswrapper[7728]: I0223 14:22:49.486911 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:49.487078 master-0 kubenswrapper[7728]: I0223 14:22:49.487070 7728 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:52.316240 master-0 kubenswrapper[7728]: E0223 14:22:52.315814 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:22:52.486989 master-0 kubenswrapper[7728]: I0223 14:22:52.486877 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:52.487294 master-0 kubenswrapper[7728]: I0223 14:22:52.487002 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:55.485596 master-0 kubenswrapper[7728]: I0223 14:22:55.485462 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:22:55.486432 master-0 kubenswrapper[7728]: I0223 14:22:55.485616 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:22:56.051021 master-0 kubenswrapper[7728]: I0223 14:22:56.050933 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/1.log" Feb 23 14:22:56.051794 master-0 kubenswrapper[7728]: I0223 14:22:56.051723 7728 generic.go:334] "Generic (PLEG): container finished" podID="865ceedb-b19a-4f2f-b295-311e1b7a645e" containerID="7afdcdc79bcf059c9a09c1210b1f2828b6a1174f97563a28f6b04cfb2b6ff9e4" exitCode=255 Feb 23 14:22:57.061177 master-0 kubenswrapper[7728]: I0223 14:22:57.061108 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/2.log" Feb 23 14:22:57.062018 master-0 kubenswrapper[7728]: I0223 14:22:57.061975 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/1.log" Feb 23 14:22:57.062790 master-0 kubenswrapper[7728]: I0223 14:22:57.062747 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/0.log" Feb 23 14:22:57.063247 master-0 kubenswrapper[7728]: I0223 14:22:57.063200 7728 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="0a3915eaedd169a17fc20783989eb20aa548b21f919f4de39f43389e2994de7c" exitCode=255 Feb 23 14:22:57.486658 master-0 kubenswrapper[7728]: I0223 14:22:57.486568 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:22:57.486921 master-0 kubenswrapper[7728]: I0223 14:22:57.486665 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:22:59.863827 master-0 kubenswrapper[7728]: E0223 14:22:59.863569 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:22:49Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:22:49Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:22:49Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:22:49Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:584b5d125dad1fa4f8d03e6ace2e4901c173569ff1ed9536da6915c56fa52bc0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8124eb3839b25af23303e9fdde35728bfd24d7c0c47530e77852cba1dd9d1ffb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1702755272},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:94d88fe2fa42931a725508dbf17296b6ed99b8e20c1169f5d1fb8a36f4927ddd\\\"],\\\"sizeBytes\\\":1637274270},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a8ac0ba2e5115c9d451d553741173ae8744d4544da15e28bf38f61630182fd\\\"],\\\"sizeBytes\\\":1237794314},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2af15d278f72034eecf3db74223b7e61f3d07c1a5c7ba760e7586915ff1b17e\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e166d1252d7455d8bd62e43f2967e738ee9bdd6a09b7771a4187d82477ae7535\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1237042376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5385b46d054c9ee73478bf23e07056d0b9f81d34619d0949927d8d9e791fcb5\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ebdc10b149ba97b999770285d06149ef92c780205d916c3cab994098e20be0ba\\\",\\\"registry.redhat.io/redhat/community-opera
tor-index:v4.18\\\"],\\\"sizeBytes\\\":1210455233},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:518982b9ad8a8bfb7bb3b4216b235cac99e126df3bb48e390b36064560c76b83\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b3293b04e31c8e67c885f77e0ad2ee994295afde7c42cb9761c7090ae0cdb3f8\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1202767548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4775c6461221dafe3ddd67ff683ccb665bed6eb278fa047d9d744aab9af65dcf\\\"],\\\"sizeBytes\\\":992461126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\\\"],\\\"sizeBytes\\\":943734757},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6c7ec917f0eff7b41d7174f1b5fdc4ce53ad106e51599afba731a8431ff9caa7\\\"],\\\"sizeBytes\\\":918153745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ff40a2d97bf7a95e19303f7e972b7e8354a3864039111c6d33d5479117aaeed\\\"],\\\"sizeBytes\\\":880247193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72fafcd55ab739919dd8a114863fda27106af1c497f474e7ce0cb23b58dfa021\\\"],\\\"sizeBytes\\\":875998518},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7b9239f1f5e9590e3db71e61fde86db8f43e0085f61ae7769508d2ea058481c7\\\"],\\\"sizeBytes\\\":862501144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:572b0ca6e993beea2ee9346197665e56a2e4999fbb6958c747c48a35bf72ee34\\\"],\\\"sizeBytes\\\":862091954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3fa84eaa1310d97fe55bb23a7c27ece85718d0643fa7fc0ff81014edb4b948b\\\"],\\\"sizeBytes\\\":772838975},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bd420e879c9f0271bca2d123a6d762591d9a4626b72f254d1f885842c32149e8\\\"],\\\"sizeBytes\\\":687849728},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c467c1eeba7434b2aebf07169ab8afe0203d638e871dbdf29a16f830e9aef9e\\\"],\\\"sizeBytes\\\":682963466},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5121a0944000b7bfa57ae2e4eb3f412e1b4b89fcc75eec1ef20241182c0527f2\\\"],\\\"sizeBytes\\\":677827184},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a31b448302fbb994548ed801ac488a44e8a7c4ae9149c3b4cc20d6af832f83\\\"],\\\"sizeBytes\\\":621542709},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e089c4e4fa9a22803b2673b776215e021a1f12a856dbcaba2fadee29bee10a3\\\"],\\\"sizeBytes\\\":589275174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1582ea693f35073e3316e2380a18227b78096ca7f4e1328f1dd8a2c423da26e9\\\"],\\\"sizeBytes\\\":582052489},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74\\\"],\\\"sizeBytes\\\":558105176},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:69f9df2f6b5cd83ab895e9e4a9bf8920d35fe450679ce06fb223944e95cfbe3e\\\"],\\\"sizeBytes\\\":557320737},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f86073cf0561e4b69668f8917ef5184cb0ef5aa16d0fefe38118f1167b268721\\\"],\\\"sizeBytes\\\":548646306},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\\\"],\\\"sizeBytes\\\":529218694},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bc0ca626e5e17f9f78ddbfde54ea13ddc7749904911817bba16e6b59f30499ec\\\"],\\\"sizeBytes\\\":528829499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396\\\"],\\\"sizeBytes\\\":518279996},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:40bb7cf7c637bf9efd8fb0157839d325a019d67cc7d727
9665fcf90dbb7f3f33\\\"],\\\"sizeBytes\\\":517888569},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\\\"],\\\"sizeBytes\\\":514875199},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a1b426a276216372c7d688fe60e9eaf251efd35071f94e1bcd4337f51a90fd75\\\"],\\\"sizeBytes\\\":513473308},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce471c00b59fd855a59f7efa9afdb3f0f9cbf1c4bcce3a82fe1a4cb82e90f52e\\\"],\\\"sizeBytes\\\":513119434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a9dcbc6b966928b7597d4a822948ae6f07b62feecb91679c1d825d0d19426e19\\\"],\\\"sizeBytes\\\":512172666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5f4a546983224e416dfcc3a700afc15f9790182a5a2f8f7c94892d0e95abab3\\\"],\\\"sizeBytes\\\":511125422},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8de5c5b21ed8c7829ba988d580ffa470c9913877fe0ee5e11bf507400ffbc7\\\"],\\\"sizeBytes\\\":511059399},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:64ba461fd5594e3a30bfd755f1496707a88249bc68d07c65124c8617d664d2ac\\\"],\\\"sizeBytes\\\":508786786},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a82e441a9e9b93f0e010f1ce26e30c24b6ca93f7752084d4694ebdb3c5b53f83\\\"],\\\"sizeBytes\\\":508443359},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896\\\"],\\\"sizeBytes\\\":507867630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e\\\"],\\\"sizeBytes\\\":506374680},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8618d42fe4da4881abe39e98691d187e13713981b66d0dac0a11cb1287482b7\\\"],\\\"sizeBytes\\\":506291135},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ce68078d909b63bb5b872d94c04829aa1b5812c416abbaf9024840d348ee68b1\\\"],\\\"sizeBytes\\\":505244089},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\\\"],\\\"sizeBytes\\\":505137106},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebf883de8fd905490f0c9b420a5d6446ecde18e12e15364f6dcd4e885104972c\\\"],\\\"sizeBytes\\\":504558291},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:897708222502e4d710dd737923f74d153c084ba6048bffceb16dfd30f79a6ecc\\\"],\\\"sizeBytes\\\":504513960},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86d9e1fdf97794f44fc1c91da025714ec6900fafa6cdc4c0041ffa95e9d70c6c\\\"],\\\"sizeBytes\\\":495888162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b\\\"],\\\"sizeBytes\\\":494959854},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2feb43a969b7dbc113ee1a572cbf6f9f922655\\\"],\\\"sizeBytes\\\":486990304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4797a485fd4ab3414ba8d52bdf2afccefab6c657b1d259baad703fca5145124c\\\"],\\\"sizeBytes\\\":484349508},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a132d09565133b36ac7c797213d6a74ac810bb368ef59136320ab3d300f45bd\\\"],\\\"sizeBytes\\\":484074784},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6acc7c3c018d8bb3cb597580eedae0300c44a5424f07129270c878899ef592a6\\\"],\\\"sizeBytes\\\":470717179},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:235b846666adaa2e4b4d6d0f7fd71d57bf3be253466e1d9fffafd103fa2696ac\\\"],\\\"sizeBytes\\\":470575802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce89154fa3fe1e87c660e644b58cf125fede575869fd5841600082c0d1f858a3\\\
"],\\\"sizeBytes\\\":468159025}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:23:00.485738 master-0 kubenswrapper[7728]: I0223 14:23:00.485664 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:00.486004 master-0 kubenswrapper[7728]: I0223 14:23:00.485743 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:03.486187 master-0 kubenswrapper[7728]: I0223 14:23:03.486103 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:03.486685 master-0 kubenswrapper[7728]: I0223 14:23:03.486220 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" 
podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:04.111977 master-0 kubenswrapper[7728]: I0223 14:23:04.111939 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/1.log" Feb 23 14:23:04.112977 master-0 kubenswrapper[7728]: I0223 14:23:04.112943 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/0.log" Feb 23 14:23:04.113076 master-0 kubenswrapper[7728]: I0223 14:23:04.113002 7728 generic.go:334] "Generic (PLEG): container finished" podID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17" containerID="3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3" exitCode=1 Feb 23 14:23:05.992994 master-0 kubenswrapper[7728]: I0223 14:23:05.992906 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 14:23:05.994063 master-0 kubenswrapper[7728]: I0223 14:23:05.993023 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 14:23:06.486339 master-0 kubenswrapper[7728]: I0223 14:23:06.486249 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:06.486684 master-0 kubenswrapper[7728]: I0223 14:23:06.486361 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:07.287001 master-0 kubenswrapper[7728]: E0223 14:23:07.286915 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:23:07.287921 master-0 kubenswrapper[7728]: E0223 14:23:07.287175 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.011s" Feb 23 14:23:07.287921 master-0 kubenswrapper[7728]: I0223 14:23:07.287212 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:23:07.287921 master-0 kubenswrapper[7728]: I0223 14:23:07.287249 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:23:07.304256 master-0 kubenswrapper[7728]: I0223 14:23:07.304180 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:23:09.318151 master-0 kubenswrapper[7728]: E0223 14:23:09.317978 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:23:09.486436 master-0 kubenswrapper[7728]: I0223 14:23:09.486336 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:09.486709 master-0 kubenswrapper[7728]: I0223 14:23:09.486457 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:09.864383 master-0 kubenswrapper[7728]: E0223 14:23:09.864295 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:23:11.027121 master-0 kubenswrapper[7728]: E0223 14:23:11.026912 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e5ff7bd5e536 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:Created,Message:Created container: openshift-config-operator,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:48.873905462 +0000 UTC m=+81.836566758,LastTimestamp:2026-02-23 14:19:48.873905462 +0000 UTC m=+81.836566758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:23:12.485661 master-0 kubenswrapper[7728]: I0223 14:23:12.485543 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:12.486675 master-0 kubenswrapper[7728]: I0223 14:23:12.485670 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:15.486706 master-0 kubenswrapper[7728]: I0223 14:23:15.486577 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:15.487809 master-0 kubenswrapper[7728]: I0223 14:23:15.486707 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:18.486228 master-0 kubenswrapper[7728]: I0223 14:23:18.486103 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:18.486228 master-0 kubenswrapper[7728]: I0223 14:23:18.486202 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:19.364100 master-0 kubenswrapper[7728]: I0223 14:23:19.363944 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:23:19.865118 master-0 kubenswrapper[7728]: E0223 14:23:19.865032 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Feb 23 14:23:21.485438 master-0 kubenswrapper[7728]: I0223 14:23:21.485360 7728 patch_prober.go:28] interesting 
pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:21.485438 master-0 kubenswrapper[7728]: I0223 14:23:21.485416 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:24.486042 master-0 kubenswrapper[7728]: I0223 14:23:24.485944 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:24.486775 master-0 kubenswrapper[7728]: I0223 14:23:24.486045 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:26.319743 master-0 kubenswrapper[7728]: E0223 14:23:26.319655 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:23:27.401908 master-0 kubenswrapper[7728]: I0223 14:23:27.401817 7728 scope.go:117] "RemoveContainer" 
containerID="5809ecf60a8e4db68dfab073298af03c567dcc4e91a5b6d7f6d78ca758010d15" Feb 23 14:23:27.426450 master-0 kubenswrapper[7728]: I0223 14:23:27.426372 7728 scope.go:117] "RemoveContainer" containerID="86dd361ededa7f9d61d9c2bea900261b661a76c0468603804e9af20765f8d8cd" Feb 23 14:23:27.450643 master-0 kubenswrapper[7728]: I0223 14:23:27.450580 7728 scope.go:117] "RemoveContainer" containerID="6d4e5cbd51d6e2350099300783b6b53e026119467c4ee08ce357bbba7d0f9eaa" Feb 23 14:23:27.486764 master-0 kubenswrapper[7728]: I0223 14:23:27.486687 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:27.486935 master-0 kubenswrapper[7728]: I0223 14:23:27.486770 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:29.363983 master-0 kubenswrapper[7728]: I0223 14:23:29.363865 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:23:29.865451 master-0 kubenswrapper[7728]: E0223 14:23:29.865374 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:23:30.485897 master-0 kubenswrapper[7728]: I0223 14:23:30.485846 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:30.486939 master-0 kubenswrapper[7728]: I0223 14:23:30.486671 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:33.486513 master-0 kubenswrapper[7728]: I0223 14:23:33.486388 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:33.487287 master-0 kubenswrapper[7728]: I0223 14:23:33.486473 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:34.315145 master-0 kubenswrapper[7728]: I0223 14:23:34.315097 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/1.log" Feb 23 14:23:34.316653 master-0 
kubenswrapper[7728]: I0223 14:23:34.316625 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/0.log" Feb 23 14:23:34.316723 master-0 kubenswrapper[7728]: I0223 14:23:34.316673 7728 generic.go:334] "Generic (PLEG): container finished" podID="12b256b7-a57b-4124-8452-25e74cfa7926" containerID="3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec" exitCode=1 Feb 23 14:23:35.328304 master-0 kubenswrapper[7728]: I0223 14:23:35.328223 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430" exitCode=1 Feb 23 14:23:36.364615 master-0 kubenswrapper[7728]: I0223 14:23:36.364471 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Feb 23 14:23:36.486151 master-0 kubenswrapper[7728]: I0223 14:23:36.486046 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:36.486364 master-0 kubenswrapper[7728]: I0223 14:23:36.486142 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 
14:23:39.486197 master-0 kubenswrapper[7728]: I0223 14:23:39.486062 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:39.486197 master-0 kubenswrapper[7728]: I0223 14:23:39.486175 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:39.865871 master-0 kubenswrapper[7728]: E0223 14:23:39.865712 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:23:39.865871 master-0 kubenswrapper[7728]: E0223 14:23:39.865787 7728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 14:23:41.306611 master-0 kubenswrapper[7728]: E0223 14:23:41.306529 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:23:41.307210 master-0 kubenswrapper[7728]: E0223 14:23:41.306738 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.019s" Feb 23 14:23:41.314365 master-0 kubenswrapper[7728]: I0223 14:23:41.314313 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 
14:23:42.485712 master-0 kubenswrapper[7728]: I0223 14:23:42.485628 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:42.485712 master-0 kubenswrapper[7728]: I0223 14:23:42.485704 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:43.321239 master-0 kubenswrapper[7728]: E0223 14:23:43.321131 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:23:45.030321 master-0 kubenswrapper[7728]: E0223 14:23:45.030110 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e5ff7c992a10 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:Started,Message:Started container openshift-config-operator,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:48.886702608 +0000 UTC 
m=+81.849363904,LastTimestamp:2026-02-23 14:19:48.886702608 +0000 UTC m=+81.849363904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:23:45.486676 master-0 kubenswrapper[7728]: I0223 14:23:45.486541 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:45.486676 master-0 kubenswrapper[7728]: I0223 14:23:45.486635 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:48.129839 master-0 kubenswrapper[7728]: I0223 14:23:48.129775 7728 status_manager.go:851] "Failed to get status for pod" podUID="43ce2f82-05aa-4778-a444-848a408cf570" pod="openshift-marketplace/redhat-marketplace-n82gm" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods redhat-marketplace-n82gm)" Feb 23 14:23:48.486019 master-0 kubenswrapper[7728]: I0223 14:23:48.485922 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:48.486019 master-0 kubenswrapper[7728]: I0223 14:23:48.485988 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:51.485754 master-0 kubenswrapper[7728]: I0223 14:23:51.485638 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:51.486630 master-0 kubenswrapper[7728]: I0223 14:23:51.485746 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:54.486398 master-0 kubenswrapper[7728]: I0223 14:23:54.486308 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:54.487157 master-0 kubenswrapper[7728]: I0223 14:23:54.486395 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:23:57.485762 master-0 kubenswrapper[7728]: I0223 14:23:57.485671 7728 
patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:23:57.486584 master-0 kubenswrapper[7728]: I0223 14:23:57.485764 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:00.264655 master-0 kubenswrapper[7728]: E0223 14:24:00.264279 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:23:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:23:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:23:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:23:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:584b5d125dad1fa4f8d03e6ace2e4901c173569ff1ed9536da6915c56fa52bc0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8124eb3839b25af23303e9fdde35728bfd24d7c0c47530e77852cba1dd9d1ffb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1702755272},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:94d88fe2fa42931a725508dbf17296b6ed99b8e20c1169f5d1fb8a36f4927ddd\\\
"],\\\"sizeBytes\\\":1637274270},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a8ac0ba2e5115c9d451d553741173ae8744d4544da15e28bf38f61630182fd\\\"],\\\"sizeBytes\\\":1237794314},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2af15d278f72034eecf3db74223b7e61f3d07c1a5c7ba760e7586915ff1b17e\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e166d1252d7455d8bd62e43f2967e738ee9bdd6a09b7771a4187d82477ae7535\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1237042376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5385b46d054c9ee73478bf23e07056d0b9f81d34619d0949927d8d9e791fcb5\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ebdc10b149ba97b999770285d06149ef92c780205d916c3cab994098e20be0ba\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1210455233},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:518982b9ad8a8bfb7bb3b4216b235cac99e126df3bb48e390b36064560c76b83\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b3293b04e31c8e67c885f77e0ad2ee994295afde7c42cb9761c7090ae0cdb3f8\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1202767548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4775c6461221dafe3ddd67ff683ccb665bed6eb278fa047d9d744aab9af65dcf\\\"],\\\"sizeBytes\\\":992461126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\\\"],\\\"sizeBytes\\\":943734757},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6c7ec917f0eff7b41d7174f1b5fdc4ce53ad106e51599afba731a8431ff9caa7\\\"],\\\"sizeBytes\\\":918153745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ff40a2d97bf7a95e19303f7e972b7e8354a3864039111c6d33d5479117aaeed\\\"],\\\"size
Bytes\\\":880247193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72fafcd55ab739919dd8a114863fda27106af1c497f474e7ce0cb23b58dfa021\\\"],\\\"sizeBytes\\\":875998518},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7b9239f1f5e9590e3db71e61fde86db8f43e0085f61ae7769508d2ea058481c7\\\"],\\\"sizeBytes\\\":862501144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:572b0ca6e993beea2ee9346197665e56a2e4999fbb6958c747c48a35bf72ee34\\\"],\\\"sizeBytes\\\":862091954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3fa84eaa1310d97fe55bb23a7c27ece85718d0643fa7fc0ff81014edb4b948b\\\"],\\\"sizeBytes\\\":772838975},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bd420e879c9f0271bca2d123a6d762591d9a4626b72f254d1f885842c32149e8\\\"],\\\"sizeBytes\\\":687849728},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c467c1eeba7434b2aebf07169ab8afe0203d638e871dbdf29a16f830e9aef9e\\\"],\\\"sizeBytes\\\":682963466},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5121a0944000b7bfa57ae2e4eb3f412e1b4b89fcc75eec1ef20241182c0527f2\\\"],\\\"sizeBytes\\\":677827184},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a31b448302fbb994548ed801ac488a44e8a7c4ae9149c3b4cc20d6af832f83\\\"],\\\"sizeBytes\\\":621542709},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e089c4e4fa9a22803b2673b776215e021a1f12a856dbcaba2fadee29bee10a3\\\"],\\\"sizeBytes\\\":589275174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1582ea693f35073e3316e2380a18227b78096ca7f4e1328f1dd8a2c423da26e9\\\"],\\\"sizeBytes\\\":582052489},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74\\\"],\\\"sizeBytes\\\":558105176},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:69
f9df2f6b5cd83ab895e9e4a9bf8920d35fe450679ce06fb223944e95cfbe3e\\\"],\\\"sizeBytes\\\":557320737},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f86073cf0561e4b69668f8917ef5184cb0ef5aa16d0fefe38118f1167b268721\\\"],\\\"sizeBytes\\\":548646306},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\\\"],\\\"sizeBytes\\\":529218694},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bc0ca626e5e17f9f78ddbfde54ea13ddc7749904911817bba16e6b59f30499ec\\\"],\\\"sizeBytes\\\":528829499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396\\\"],\\\"sizeBytes\\\":518279996},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:40bb7cf7c637bf9efd8fb0157839d325a019d67cc7d7279665fcf90dbb7f3f33\\\"],\\\"sizeBytes\\\":517888569},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\\\"],\\\"sizeBytes\\\":514875199},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a1b426a276216372c7d688fe60e9eaf251efd35071f94e1bcd4337f51a90fd75\\\"],\\\"sizeBytes\\\":513473308},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce471c00b59fd855a59f7efa9afdb3f0f9cbf1c4bcce3a82fe1a4cb82e90f52e\\\"],\\\"sizeBytes\\\":513119434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a9dcbc6b966928b7597d4a822948ae6f07b62feecb91679c1d825d0d19426e19\\\"],\\\"sizeBytes\\\":512172666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5f4a546983224e416dfcc3a700afc15f9790182a5a2f8f7c94892d0e95abab3\\\"],\\\"sizeBytes\\\":511125422},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8de5c5b21ed8c7829ba988d580ffa470c9913877fe0ee5e11bf507400ffbc7\\\"],\\\"sizeBytes\\\":511059399},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:64ba461fd5594e3a30bfd755f1496707a88249bc68d07c65124c8617d664d2ac\\\"],\\\"sizeBytes\\\":508786786},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a82e441a9e9b93f0e010f1ce26e30c24b6ca93f7752084d4694ebdb3c5b53f83\\\"],\\\"sizeBytes\\\":508443359},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896\\\"],\\\"sizeBytes\\\":507867630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e\\\"],\\\"sizeBytes\\\":506374680},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8618d42fe4da4881abe39e98691d187e13713981b66d0dac0a11cb1287482b7\\\"],\\\"sizeBytes\\\":506291135},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce68078d909b63bb5b872d94c04829aa1b5812c416abbaf9024840d348ee68b1\\\"],\\\"sizeBytes\\\":505244089},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\\\"],\\\"sizeBytes\\\":505137106},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebf883de8fd905490f0c9b420a5d6446ecde18e12e15364f6dcd4e885104972c\\\"],\\\"sizeBytes\\\":504558291},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:897708222502e4d710dd737923f74d153c084ba6048bffceb16dfd30f79a6ecc\\\"],\\\"sizeBytes\\\":504513960},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86d9e1fdf97794f44fc1c91da025714ec6900fafa6cdc4c0041ffa95e9d70c6c\\\"],\\\"sizeBytes\\\":495888162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b\\\"],\\\"sizeBytes\\\":494959854},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2f
eb43a969b7dbc113ee1a572cbf6f9f922655\\\"],\\\"sizeBytes\\\":486990304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4797a485fd4ab3414ba8d52bdf2afccefab6c657b1d259baad703fca5145124c\\\"],\\\"sizeBytes\\\":484349508},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a132d09565133b36ac7c797213d6a74ac810bb368ef59136320ab3d300f45bd\\\"],\\\"sizeBytes\\\":484074784},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6acc7c3c018d8bb3cb597580eedae0300c44a5424f07129270c878899ef592a6\\\"],\\\"sizeBytes\\\":470717179},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:235b846666adaa2e4b4d6d0f7fd71d57bf3be253466e1d9fffafd103fa2696ac\\\"],\\\"sizeBytes\\\":470575802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce89154fa3fe1e87c660e644b58cf125fede575869fd5841600082c0d1f858a3\\\"],\\\"sizeBytes\\\":468159025}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:24:00.322317 master-0 kubenswrapper[7728]: E0223 14:24:00.322214 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:24:00.486706 master-0 kubenswrapper[7728]: I0223 14:24:00.486617 7728 patch_prober.go:28] interesting 
pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:00.487031 master-0 kubenswrapper[7728]: I0223 14:24:00.486706 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:03.486779 master-0 kubenswrapper[7728]: I0223 14:24:03.486680 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:03.487672 master-0 kubenswrapper[7728]: I0223 14:24:03.486782 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:04.525315 master-0 kubenswrapper[7728]: I0223 14:24:04.525241 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/1.log" Feb 23 14:24:04.526843 master-0 kubenswrapper[7728]: I0223 14:24:04.526774 7728 generic.go:334] "Generic (PLEG): container finished" podID="961e4ecd-545b-4270-ae34-e733dec793b6" 
containerID="4077843a1666052f92a3616061104688e7ad630e49b861a884467e2e98bfca5d" exitCode=255 Feb 23 14:24:04.530000 master-0 kubenswrapper[7728]: I0223 14:24:04.529942 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/1.log" Feb 23 14:24:04.530725 master-0 kubenswrapper[7728]: I0223 14:24:04.530684 7728 generic.go:334] "Generic (PLEG): container finished" podID="674041a2-e2b0-4286-88cc-f1b00571e3f3" containerID="a2ae49d4722a1e40cd55dc37aa9260d992134f5c1ca873bbced79ae9e75c00b6" exitCode=255 Feb 23 14:24:04.533257 master-0 kubenswrapper[7728]: I0223 14:24:04.533076 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/2.log" Feb 23 14:24:04.533945 master-0 kubenswrapper[7728]: I0223 14:24:04.533889 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/1.log" Feb 23 14:24:04.534718 master-0 kubenswrapper[7728]: I0223 14:24:04.534660 7728 generic.go:334] "Generic (PLEG): container finished" podID="cf04aca0-8174-4134-835d-37adf6a3b5ca" containerID="945eeda0e74497e88a62481b91d0c1abd43853d97eee9a925d78fc6fc7443101" exitCode=255 Feb 23 14:24:04.537008 master-0 kubenswrapper[7728]: I0223 14:24:04.536960 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/1.log" Feb 23 14:24:04.537677 master-0 kubenswrapper[7728]: I0223 14:24:04.537623 7728 generic.go:334] "Generic (PLEG): container finished" podID="b714a9df-026e-423d-a980-2569f0d92e47" 
containerID="796f1fd46fb4b7de05e9d7265e3f4d090bdf2c82e271c42f786a885903a59f3d" exitCode=255 Feb 23 14:24:04.541132 master-0 kubenswrapper[7728]: I0223 14:24:04.541083 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/1.log" Feb 23 14:24:04.541733 master-0 kubenswrapper[7728]: I0223 14:24:04.541685 7728 generic.go:334] "Generic (PLEG): container finished" podID="24829faf-50e8-45bb-abb0-7cc5ccf81080" containerID="81ff2e6b5bae83ef9904fdb97ede2ea9a1442b7adab5da48d4302eefab7a166a" exitCode=255 Feb 23 14:24:04.543991 master-0 kubenswrapper[7728]: I0223 14:24:04.543936 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/1.log" Feb 23 14:24:04.544434 master-0 kubenswrapper[7728]: I0223 14:24:04.544387 7728 generic.go:334] "Generic (PLEG): container finished" podID="b9cf1c39-24f0-420b-8020-089616d1cdf0" containerID="b4c3088f58a98599776247fbbf6c6a5a5751aeab22dce8d8142cec8b35a3fab9" exitCode=255 Feb 23 14:24:06.486569 master-0 kubenswrapper[7728]: I0223 14:24:06.486466 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:06.487675 master-0 kubenswrapper[7728]: I0223 14:24:06.487624 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: 
connect: connection refused" Feb 23 14:24:09.486111 master-0 kubenswrapper[7728]: I0223 14:24:09.486035 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:09.486942 master-0 kubenswrapper[7728]: I0223 14:24:09.486128 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:10.265600 master-0 kubenswrapper[7728]: E0223 14:24:10.265397 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:24:12.486105 master-0 kubenswrapper[7728]: I0223 14:24:12.485994 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:12.487097 master-0 kubenswrapper[7728]: I0223 14:24:12.486120 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 
14:24:15.317451 master-0 kubenswrapper[7728]: E0223 14:24:15.317359 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:24:15.318457 master-0 kubenswrapper[7728]: E0223 14:24:15.317711 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.011s" Feb 23 14:24:15.329130 master-0 kubenswrapper[7728]: I0223 14:24:15.329037 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:24:15.486574 master-0 kubenswrapper[7728]: I0223 14:24:15.486365 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:15.486574 master-0 kubenswrapper[7728]: I0223 14:24:15.486468 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:17.324219 master-0 kubenswrapper[7728]: E0223 14:24:17.324120 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:24:18.486042 master-0 kubenswrapper[7728]: I0223 14:24:18.485953 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:18.486042 master-0 kubenswrapper[7728]: I0223 14:24:18.486041 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:19.033114 master-0 kubenswrapper[7728]: E0223 14:24:19.032899 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Feb 23 14:24:19.033114 master-0 kubenswrapper[7728]: &Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e60016c4c949 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.128.0.52:8443/healthz": dial tcp 10.128.0.52:8443: connect: connection refused Feb 23 14:24:19.033114 master-0 kubenswrapper[7728]: body: Feb 23 14:24:19.033114 master-0 kubenswrapper[7728]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:51.473252681 +0000 UTC m=+84.435913977,LastTimestamp:2026-02-23 14:19:51.473252681 +0000 UTC m=+84.435913977,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Feb 23 14:24:19.033114 master-0 
kubenswrapper[7728]: > Feb 23 14:24:20.265926 master-0 kubenswrapper[7728]: E0223 14:24:20.265756 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Feb 23 14:24:21.486203 master-0 kubenswrapper[7728]: I0223 14:24:21.486114 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:21.487067 master-0 kubenswrapper[7728]: I0223 14:24:21.486209 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:24.486527 master-0 kubenswrapper[7728]: I0223 14:24:24.486423 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:24.486527 master-0 kubenswrapper[7728]: I0223 14:24:24.486520 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:24.682653 master-0 kubenswrapper[7728]: I0223 
14:24:24.682587 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bcf775fc9-z5t5b_57b57915-64dd-42f5-b06f-bc4bcc06b667/cluster-node-tuning-operator/0.log" Feb 23 14:24:24.682653 master-0 kubenswrapper[7728]: I0223 14:24:24.682657 7728 generic.go:334] "Generic (PLEG): container finished" podID="57b57915-64dd-42f5-b06f-bc4bcc06b667" containerID="2f96ee533f5d52939bd2d7faf41993b118d9a6bfbb0b89e7580d1b1a849ba083" exitCode=1 Feb 23 14:24:24.685106 master-0 kubenswrapper[7728]: I0223 14:24:24.685038 7728 generic.go:334] "Generic (PLEG): container finished" podID="cb6e88cd-98de-446a-92e8-f56a2f133703" containerID="031c49419dbbce343a020e2a52b0b21aa31f7846ce6d6338d427aedeeb387c27" exitCode=0 Feb 23 14:24:27.486611 master-0 kubenswrapper[7728]: I0223 14:24:27.486359 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:27.486611 master-0 kubenswrapper[7728]: I0223 14:24:27.486466 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:28.716006 master-0 kubenswrapper[7728]: I0223 14:24:28.715917 7728 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="bb60c962ed53b03fdfea9c76fcac5c126728571b797b7f917d784b1b7debd024" exitCode=0 Feb 23 14:24:30.267315 master-0 kubenswrapper[7728]: E0223 14:24:30.266847 7728 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:24:30.486052 master-0 kubenswrapper[7728]: I0223 14:24:30.485961 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:30.486454 master-0 kubenswrapper[7728]: I0223 14:24:30.486056 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:33.485831 master-0 kubenswrapper[7728]: I0223 14:24:33.485745 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:33.486802 master-0 kubenswrapper[7728]: I0223 14:24:33.485831 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:34.325608 master-0 kubenswrapper[7728]: E0223 14:24:34.325425 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:24:34.768212 master-0 kubenswrapper[7728]: I0223 14:24:34.768114 7728 generic.go:334] "Generic (PLEG): container finished" podID="8de1f285-47ac-42aa-8026-8addce656362" containerID="5f94c8fde6ae66d48d8282c5c57e237550057485795720cf3e4f35047fc2b408" exitCode=0 Feb 23 14:24:34.770640 master-0 kubenswrapper[7728]: I0223 14:24:34.770586 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/1.log" Feb 23 14:24:34.772403 master-0 kubenswrapper[7728]: I0223 14:24:34.772277 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/0.log" Feb 23 14:24:34.772621 master-0 kubenswrapper[7728]: I0223 14:24:34.772424 7728 generic.go:334] "Generic (PLEG): container finished" podID="3488a7eb-5170-478c-9af7-490dbe0f514e" containerID="3ea10ed9b3b081ac010f974ab393059b11852999309c95ddd5381bd40e623b2e" exitCode=1 Feb 23 14:24:36.485942 master-0 kubenswrapper[7728]: I0223 14:24:36.485837 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:36.485942 master-0 kubenswrapper[7728]: I0223 14:24:36.485922 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get 
\"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:39.486229 master-0 kubenswrapper[7728]: I0223 14:24:39.486111 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:39.486229 master-0 kubenswrapper[7728]: I0223 14:24:39.486201 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:40.268165 master-0 kubenswrapper[7728]: E0223 14:24:40.267290 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:24:40.268165 master-0 kubenswrapper[7728]: E0223 14:24:40.267355 7728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 14:24:42.486158 master-0 kubenswrapper[7728]: I0223 14:24:42.486098 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:42.487224 master-0 kubenswrapper[7728]: I0223 14:24:42.486758 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:45.486965 master-0 kubenswrapper[7728]: I0223 14:24:45.486830 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:45.486965 master-0 kubenswrapper[7728]: I0223 14:24:45.486935 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:48.132525 master-0 kubenswrapper[7728]: I0223 14:24:48.132366 7728 status_manager.go:851] "Failed to get status for pod" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" pod="openshift-marketplace/redhat-operators-mtvwp" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods redhat-operators-mtvwp)" Feb 23 14:24:48.486061 master-0 kubenswrapper[7728]: I0223 14:24:48.485954 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:24:48.486411 master-0 kubenswrapper[7728]: I0223 14:24:48.486080 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:24:49.332619 master-0 kubenswrapper[7728]: E0223 14:24:49.332531 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:24:49.333571 master-0 kubenswrapper[7728]: E0223 14:24:49.332802 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.015s" Feb 23 14:24:49.333571 master-0 kubenswrapper[7728]: I0223 14:24:49.332840 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:24:49.333571 master-0 kubenswrapper[7728]: I0223 14:24:49.332943 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mtvwp" Feb 23 14:24:49.334517 master-0 kubenswrapper[7728]: I0223 14:24:49.334374 7728 scope.go:117] "RemoveContainer" containerID="4077843a1666052f92a3616061104688e7ad630e49b861a884467e2e98bfca5d" Feb 23 14:24:49.335410 master-0 kubenswrapper[7728]: I0223 14:24:49.335335 7728 scope.go:117] "RemoveContainer" containerID="a2ae49d4722a1e40cd55dc37aa9260d992134f5c1ca873bbced79ae9e75c00b6" Feb 23 14:24:49.336667 master-0 kubenswrapper[7728]: I0223 14:24:49.335835 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a927374fcf62fad56c9d8325450d07e92c07f04787ed291d9c0071fab4d22549"} pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 
14:24:49.336667 master-0 kubenswrapper[7728]: I0223 14:24:49.335911 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" containerID="cri-o://a927374fcf62fad56c9d8325450d07e92c07f04787ed291d9c0071fab4d22549" gracePeriod=600 Feb 23 14:24:49.336667 master-0 kubenswrapper[7728]: I0223 14:24:49.336552 7728 scope.go:117] "RemoveContainer" containerID="5f94c8fde6ae66d48d8282c5c57e237550057485795720cf3e4f35047fc2b408" Feb 23 14:24:49.337253 master-0 kubenswrapper[7728]: I0223 14:24:49.337193 7728 scope.go:117] "RemoveContainer" containerID="b4c3088f58a98599776247fbbf6c6a5a5751aeab22dce8d8142cec8b35a3fab9" Feb 23 14:24:49.337928 master-0 kubenswrapper[7728]: I0223 14:24:49.337883 7728 scope.go:117] "RemoveContainer" containerID="7afdcdc79bcf059c9a09c1210b1f2828b6a1174f97563a28f6b04cfb2b6ff9e4" Feb 23 14:24:49.338608 master-0 kubenswrapper[7728]: I0223 14:24:49.338366 7728 scope.go:117] "RemoveContainer" containerID="0a3915eaedd169a17fc20783989eb20aa548b21f919f4de39f43389e2994de7c" Feb 23 14:24:49.338929 master-0 kubenswrapper[7728]: I0223 14:24:49.338796 7728 scope.go:117] "RemoveContainer" containerID="2f96ee533f5d52939bd2d7faf41993b118d9a6bfbb0b89e7580d1b1a849ba083" Feb 23 14:24:49.339472 master-0 kubenswrapper[7728]: I0223 14:24:49.339429 7728 scope.go:117] "RemoveContainer" containerID="3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec" Feb 23 14:24:49.339717 master-0 kubenswrapper[7728]: I0223 14:24:49.339638 7728 scope.go:117] "RemoveContainer" containerID="031c49419dbbce343a020e2a52b0b21aa31f7846ce6d6338d427aedeeb387c27" Feb 23 14:24:49.343462 master-0 kubenswrapper[7728]: I0223 14:24:49.343295 7728 scope.go:117] "RemoveContainer" containerID="bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430" Feb 23 14:24:49.352424 master-0 kubenswrapper[7728]: I0223 
14:24:49.352329 7728 scope.go:117] "RemoveContainer" containerID="bb60c962ed53b03fdfea9c76fcac5c126728571b797b7f917d784b1b7debd024" Feb 23 14:24:49.355820 master-0 kubenswrapper[7728]: I0223 14:24:49.355768 7728 scope.go:117] "RemoveContainer" containerID="796f1fd46fb4b7de05e9d7265e3f4d090bdf2c82e271c42f786a885903a59f3d" Feb 23 14:24:49.356278 master-0 kubenswrapper[7728]: I0223 14:24:49.356225 7728 scope.go:117] "RemoveContainer" containerID="81ff2e6b5bae83ef9904fdb97ede2ea9a1442b7adab5da48d4302eefab7a166a" Feb 23 14:24:49.357452 master-0 kubenswrapper[7728]: I0223 14:24:49.357414 7728 scope.go:117] "RemoveContainer" containerID="3ea10ed9b3b081ac010f974ab393059b11852999309c95ddd5381bd40e623b2e" Feb 23 14:24:49.357740 master-0 kubenswrapper[7728]: I0223 14:24:49.357710 7728 scope.go:117] "RemoveContainer" containerID="945eeda0e74497e88a62481b91d0c1abd43853d97eee9a925d78fc6fc7443101" Feb 23 14:24:49.357806 master-0 kubenswrapper[7728]: I0223 14:24:49.357774 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:24:49.361427 master-0 kubenswrapper[7728]: I0223 14:24:49.361371 7728 scope.go:117] "RemoveContainer" containerID="3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3" Feb 23 14:24:49.881063 master-0 kubenswrapper[7728]: I0223 14:24:49.881017 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/1.log" Feb 23 14:24:49.882034 master-0 kubenswrapper[7728]: I0223 14:24:49.881987 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/0.log" Feb 23 14:24:49.884237 master-0 kubenswrapper[7728]: I0223 14:24:49.884050 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/1.log" Feb 23 14:24:49.887626 master-0 kubenswrapper[7728]: I0223 14:24:49.887472 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/1.log" Feb 23 14:24:49.894449 master-0 kubenswrapper[7728]: I0223 14:24:49.894414 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/2.log" Feb 23 14:24:49.895073 master-0 kubenswrapper[7728]: I0223 14:24:49.895034 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/1.log" Feb 23 14:24:49.895706 master-0 kubenswrapper[7728]: I0223 14:24:49.895668 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/0.log" Feb 23 14:24:49.900783 master-0 kubenswrapper[7728]: I0223 14:24:49.900730 7728 generic.go:334] "Generic (PLEG): container finished" podID="76c67569-3a72-4de9-87cd-432a4607b15b" containerID="a927374fcf62fad56c9d8325450d07e92c07f04787ed291d9c0071fab4d22549" exitCode=0 Feb 23 14:24:49.903945 master-0 kubenswrapper[7728]: I0223 14:24:49.903913 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bcf775fc9-z5t5b_57b57915-64dd-42f5-b06f-bc4bcc06b667/cluster-node-tuning-operator/0.log" Feb 23 14:24:50.265207 master-0 kubenswrapper[7728]: I0223 14:24:50.264823 7728 patch_prober.go:28] 
interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body= Feb 23 14:24:50.265207 master-0 kubenswrapper[7728]: I0223 14:24:50.264887 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" Feb 23 14:24:50.265207 master-0 kubenswrapper[7728]: I0223 14:24:50.265040 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body= Feb 23 14:24:50.265207 master-0 kubenswrapper[7728]: I0223 14:24:50.265122 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" Feb 23 14:24:50.920746 master-0 kubenswrapper[7728]: I0223 14:24:50.920594 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/1.log" Feb 23 14:24:50.921508 master-0 kubenswrapper[7728]: I0223 14:24:50.921196 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/0.log" Feb 23 14:24:50.925558 master-0 kubenswrapper[7728]: I0223 14:24:50.925496 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-5c75f78c8b-cj2l7_5b54fc16-d2f7-4b10-a611-5b411b389c5a/package-server-manager/0.log" Feb 23 14:24:50.925925 master-0 kubenswrapper[7728]: I0223 14:24:50.925846 7728 generic.go:334] "Generic (PLEG): container finished" podID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerID="3e43920c8c9e66c01584e52a234477388c129ea94fe151ecc6c23098a8981522" exitCode=1 Feb 23 14:24:50.929823 master-0 kubenswrapper[7728]: I0223 14:24:50.929768 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/1.log" Feb 23 14:24:50.931849 master-0 kubenswrapper[7728]: I0223 14:24:50.931804 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/1.log" Feb 23 14:24:50.935530 master-0 kubenswrapper[7728]: I0223 14:24:50.935440 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/1.log" Feb 23 14:24:50.936468 master-0 kubenswrapper[7728]: I0223 14:24:50.936412 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/0.log" Feb 23 14:24:50.938390 master-0 kubenswrapper[7728]: I0223 14:24:50.938328 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/1.log" Feb 23 14:24:50.940409 master-0 kubenswrapper[7728]: I0223 14:24:50.940371 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/1.log" Feb 23 14:24:50.942374 master-0 kubenswrapper[7728]: I0223 14:24:50.942321 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/2.log" Feb 23 14:24:50.942819 master-0 kubenswrapper[7728]: I0223 14:24:50.942768 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/1.log" Feb 23 14:24:50.947988 master-0 kubenswrapper[7728]: I0223 14:24:50.947932 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/1.log" Feb 23 14:24:50.949749 master-0 kubenswrapper[7728]: I0223 14:24:50.949702 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/1.log" Feb 23 14:24:50.950685 master-0 kubenswrapper[7728]: I0223 14:24:50.950516 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/0.log" Feb 23 14:24:51.326972 master-0 kubenswrapper[7728]: E0223 14:24:51.326823 7728 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:24:53.036988 master-0 kubenswrapper[7728]: E0223 14:24:53.036786 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e60016c58bd2 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:51.473302482 +0000 UTC m=+84.435963778,LastTimestamp:2026-02-23 14:19:51.473302482 +0000 UTC m=+84.435963778,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:25:00.266008 master-0 kubenswrapper[7728]: I0223 14:25:00.265863 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body= Feb 23 14:25:00.266008 master-0 kubenswrapper[7728]: I0223 14:25:00.265961 7728 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" Feb 23 14:25:00.266897 master-0 kubenswrapper[7728]: I0223 14:25:00.265965 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body= Feb 23 14:25:00.266897 master-0 kubenswrapper[7728]: I0223 14:25:00.266372 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" Feb 23 14:25:00.359728 master-0 kubenswrapper[7728]: E0223 14:25:00.359436 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:24:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:24:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:24:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:24:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:584b5d125dad1fa4f8d03e6ace2e4901c173569ff1ed9536da6915c56fa52bc0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8124eb3839b25af23303e9fdde35728bfd24d7c0c47530e77852cba1dd9d1ffb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1702755272},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:94d88fe2fa42931a725508dbf17296b6ed99b8e20c1169f5d1fb8a36f4927ddd\\\"],\\\"sizeBytes\\\":1637274270},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a8ac0ba2e5115c9d451d553741173ae8744d4544da15e28bf38f61630182fd\\\"],\\\"sizeBytes\\\":1237794314},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2af15d278f72034eecf3db74223b7e61f3d07c1a5c7ba760e7586915ff1b17e\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e166d1252d7455d8bd62e43f2967e738ee9bdd6a09b7771a4187d82477ae7535\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1237042376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5385b46d054c9ee73478bf23e07056d0b9f81d34619d0949927d8d9e791fcb5\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ebdc10b149ba97b999770285d06149ef92c780205d916c3cab994098e20be0ba\\\",\\\"registry.redhat.io/redhat/community-opera
tor-index:v4.18\\\"],\\\"sizeBytes\\\":1210455233},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:518982b9ad8a8bfb7bb3b4216b235cac99e126df3bb48e390b36064560c76b83\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b3293b04e31c8e67c885f77e0ad2ee994295afde7c42cb9761c7090ae0cdb3f8\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1202767548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4775c6461221dafe3ddd67ff683ccb665bed6eb278fa047d9d744aab9af65dcf\\\"],\\\"sizeBytes\\\":992461126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\\\"],\\\"sizeBytes\\\":943734757},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6c7ec917f0eff7b41d7174f1b5fdc4ce53ad106e51599afba731a8431ff9caa7\\\"],\\\"sizeBytes\\\":918153745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ff40a2d97bf7a95e19303f7e972b7e8354a3864039111c6d33d5479117aaeed\\\"],\\\"sizeBytes\\\":880247193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72fafcd55ab739919dd8a114863fda27106af1c497f474e7ce0cb23b58dfa021\\\"],\\\"sizeBytes\\\":875998518},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7b9239f1f5e9590e3db71e61fde86db8f43e0085f61ae7769508d2ea058481c7\\\"],\\\"sizeBytes\\\":862501144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:572b0ca6e993beea2ee9346197665e56a2e4999fbb6958c747c48a35bf72ee34\\\"],\\\"sizeBytes\\\":862091954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3fa84eaa1310d97fe55bb23a7c27ece85718d0643fa7fc0ff81014edb4b948b\\\"],\\\"sizeBytes\\\":772838975},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bd420e879c9f0271bca2d123a6d762591d9a4626b72f254d1f885842c32149e8\\\"],\\\"sizeBytes\\\":687849728},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c467c1eeba7434b2aebf07169ab8afe0203d638e871dbdf29a16f830e9aef9e\\\"],\\\"sizeBytes\\\":682963466},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5121a0944000b7bfa57ae2e4eb3f412e1b4b89fcc75eec1ef20241182c0527f2\\\"],\\\"sizeBytes\\\":677827184},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a31b448302fbb994548ed801ac488a44e8a7c4ae9149c3b4cc20d6af832f83\\\"],\\\"sizeBytes\\\":621542709},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e089c4e4fa9a22803b2673b776215e021a1f12a856dbcaba2fadee29bee10a3\\\"],\\\"sizeBytes\\\":589275174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1582ea693f35073e3316e2380a18227b78096ca7f4e1328f1dd8a2c423da26e9\\\"],\\\"sizeBytes\\\":582052489},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74\\\"],\\\"sizeBytes\\\":558105176},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:69f9df2f6b5cd83ab895e9e4a9bf8920d35fe450679ce06fb223944e95cfbe3e\\\"],\\\"sizeBytes\\\":557320737},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f86073cf0561e4b69668f8917ef5184cb0ef5aa16d0fefe38118f1167b268721\\\"],\\\"sizeBytes\\\":548646306},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\\\"],\\\"sizeBytes\\\":529218694},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bc0ca626e5e17f9f78ddbfde54ea13ddc7749904911817bba16e6b59f30499ec\\\"],\\\"sizeBytes\\\":528829499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396\\\"],\\\"sizeBytes\\\":518279996},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:40bb7cf7c637bf9efd8fb0157839d325a019d67cc7d727
9665fcf90dbb7f3f33\\\"],\\\"sizeBytes\\\":517888569},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\\\"],\\\"sizeBytes\\\":514875199},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a1b426a276216372c7d688fe60e9eaf251efd35071f94e1bcd4337f51a90fd75\\\"],\\\"sizeBytes\\\":513473308},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce471c00b59fd855a59f7efa9afdb3f0f9cbf1c4bcce3a82fe1a4cb82e90f52e\\\"],\\\"sizeBytes\\\":513119434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a9dcbc6b966928b7597d4a822948ae6f07b62feecb91679c1d825d0d19426e19\\\"],\\\"sizeBytes\\\":512172666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5f4a546983224e416dfcc3a700afc15f9790182a5a2f8f7c94892d0e95abab3\\\"],\\\"sizeBytes\\\":511125422},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8de5c5b21ed8c7829ba988d580ffa470c9913877fe0ee5e11bf507400ffbc7\\\"],\\\"sizeBytes\\\":511059399},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:64ba461fd5594e3a30bfd755f1496707a88249bc68d07c65124c8617d664d2ac\\\"],\\\"sizeBytes\\\":508786786},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a82e441a9e9b93f0e010f1ce26e30c24b6ca93f7752084d4694ebdb3c5b53f83\\\"],\\\"sizeBytes\\\":508443359},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896\\\"],\\\"sizeBytes\\\":507867630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e\\\"],\\\"sizeBytes\\\":506374680},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8618d42fe4da4881abe39e98691d187e13713981b66d0dac0a11cb1287482b7\\\"],\\\"sizeBytes\\\":506291135},{\\\"names\\\":[\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:ce68078d909b63bb5b872d94c04829aa1b5812c416abbaf9024840d348ee68b1\\\"],\\\"sizeBytes\\\":505244089},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\\\"],\\\"sizeBytes\\\":505137106},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebf883de8fd905490f0c9b420a5d6446ecde18e12e15364f6dcd4e885104972c\\\"],\\\"sizeBytes\\\":504558291},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:897708222502e4d710dd737923f74d153c084ba6048bffceb16dfd30f79a6ecc\\\"],\\\"sizeBytes\\\":504513960},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86d9e1fdf97794f44fc1c91da025714ec6900fafa6cdc4c0041ffa95e9d70c6c\\\"],\\\"sizeBytes\\\":495888162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b\\\"],\\\"sizeBytes\\\":494959854},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2feb43a969b7dbc113ee1a572cbf6f9f922655\\\"],\\\"sizeBytes\\\":486990304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4797a485fd4ab3414ba8d52bdf2afccefab6c657b1d259baad703fca5145124c\\\"],\\\"sizeBytes\\\":484349508},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a132d09565133b36ac7c797213d6a74ac810bb368ef59136320ab3d300f45bd\\\"],\\\"sizeBytes\\\":484074784},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6acc7c3c018d8bb3cb597580eedae0300c44a5424f07129270c878899ef592a6\\\"],\\\"sizeBytes\\\":470717179},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:235b846666adaa2e4b4d6d0f7fd71d57bf3be253466e1d9fffafd103fa2696ac\\\"],\\\"sizeBytes\\\":470575802},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce89154fa3fe1e87c660e644b58cf125fede575869fd5841600082c0d1f858a3\\\
"],\\\"sizeBytes\\\":468159025}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:25:02.366400 master-0 kubenswrapper[7728]: E0223 14:25:02.366347 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Feb 23 14:25:06.082306 master-0 kubenswrapper[7728]: I0223 14:25:06.082251 7728 generic.go:334] "Generic (PLEG): container finished" podID="af950a67-1557-4352-8100-27281bb8ecbe" containerID="3d6da8a2ab007c14781f7a758e38f4dc17838974a913b095ba4be079439082e2" exitCode=0 Feb 23 14:25:08.329186 master-0 kubenswrapper[7728]: E0223 14:25:08.329126 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="7s" Feb 23 14:25:10.265179 master-0 kubenswrapper[7728]: I0223 14:25:10.265064 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body= Feb 23 14:25:10.265179 master-0 kubenswrapper[7728]: I0223 14:25:10.265169 7728 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" Feb 23 14:25:10.265844 master-0 kubenswrapper[7728]: I0223 14:25:10.265230 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body= Feb 23 14:25:10.265844 master-0 kubenswrapper[7728]: I0223 14:25:10.265329 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" Feb 23 14:25:10.360341 master-0 kubenswrapper[7728]: E0223 14:25:10.360275 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:25:16.167367 master-0 kubenswrapper[7728]: I0223 14:25:16.167267 7728 generic.go:334] "Generic (PLEG): container finished" podID="709ac071-4392-4a3f-a3d1-4bc8ba2f6236" containerID="c28d30a2b760e3ebbe98681a086eea9adf4942f9ca5f692597b7830f1309f2a8" exitCode=0 Feb 23 14:25:18.184817 master-0 kubenswrapper[7728]: I0223 14:25:18.184718 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/cluster-autoscaler-operator/0.log" Feb 23 14:25:18.185848 master-0 kubenswrapper[7728]: I0223 14:25:18.185371 7728 generic.go:334] "Generic (PLEG): container finished" podID="3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3" containerID="08d3df84ad8de18eec9e6a636baf4cb95ff798ffedb9a2d917a6b77d6c934fb7" exitCode=255 Feb 23 14:25:18.187942 master-0 kubenswrapper[7728]: I0223 14:25:18.187882 7728 generic.go:334] "Generic (PLEG): container finished" podID="ad0f0d72-0337-4347-bb50-e299a175f3ca" containerID="c3f209a9ce16ae00e125bd88a555117337a8948041a4b5c781124f66c958f969" exitCode=0 Feb 23 14:25:18.190577 master-0 kubenswrapper[7728]: I0223 14:25:18.190519 7728 generic.go:334] "Generic (PLEG): container finished" podID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerID="54011ffa8f000620849835983f9e2c00740786e321e8cd4e4de797c7d208b465" exitCode=0 Feb 23 14:25:19.198765 master-0 kubenswrapper[7728]: I0223 14:25:19.198681 7728 generic.go:334] "Generic (PLEG): container finished" podID="fbb66172-1ea9-4683-b88f-227c4fd94924" containerID="4bd96aadee1934ae65fac50d897e75007505a399d7d143ad871ced8edd81b895" exitCode=0 Feb 23 14:25:19.200702 master-0 kubenswrapper[7728]: I0223 14:25:19.200639 7728 generic.go:334] "Generic (PLEG): container finished" podID="482284fd-6911-4ba6-8d57-7966cc51117a" containerID="30dd1f19a8b444dbc9b769a06f0917819d1c1e9174b5fb3b5552595a9eed345f" exitCode=0 Feb 23 14:25:19.203293 master-0 kubenswrapper[7728]: I0223 14:25:19.203242 7728 generic.go:334] "Generic (PLEG): container finished" podID="a4ae9292-71dc-4484-b277-43cb26c1e04d" containerID="fafa7b0f21c17417165ff9592e80bbb6992685b66472f608cb30827b7d663491" exitCode=0 Feb 23 14:25:19.206845 master-0 kubenswrapper[7728]: I0223 14:25:19.206797 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" 
containerID="f38113657e6647d113d4b8b771a4b871cb4df714ffeae8172aebba272b7e4da9" exitCode=0 Feb 23 14:25:20.216664 master-0 kubenswrapper[7728]: I0223 14:25:20.216613 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/2.log" Feb 23 14:25:20.217334 master-0 kubenswrapper[7728]: I0223 14:25:20.217284 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/1.log" Feb 23 14:25:20.217946 master-0 kubenswrapper[7728]: I0223 14:25:20.217914 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/0.log" Feb 23 14:25:20.217994 master-0 kubenswrapper[7728]: I0223 14:25:20.217963 7728 generic.go:334] "Generic (PLEG): container finished" podID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17" containerID="4d73e7e4ca95353fc1daf5c78e6fb2d258da6e5bbcc4add88a2d98722b1263c5" exitCode=1 Feb 23 14:25:20.266260 master-0 kubenswrapper[7728]: I0223 14:25:20.266177 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body= Feb 23 14:25:20.266628 master-0 kubenswrapper[7728]: I0223 14:25:20.266275 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: 
connection refused" Feb 23 14:25:20.291228 master-0 kubenswrapper[7728]: I0223 14:25:20.291137 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Feb 23 14:25:20.361417 master-0 kubenswrapper[7728]: E0223 14:25:20.361234 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:25:23.361151 master-0 kubenswrapper[7728]: E0223 14:25:23.361053 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Feb 23 14:25:23.362254 master-0 kubenswrapper[7728]: E0223 14:25:23.361345 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.028s" Feb 23 14:25:23.372850 master-0 kubenswrapper[7728]: I0223 14:25:23.372777 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Feb 23 14:25:23.713913 master-0 kubenswrapper[7728]: I0223 14:25:23.713798 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body= Feb 23 14:25:23.713913 master-0 kubenswrapper[7728]: I0223 14:25:23.713883 7728 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused"
Feb 23 14:25:23.714287 master-0 kubenswrapper[7728]: I0223 14:25:23.713996 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body=
Feb 23 14:25:23.714287 master-0 kubenswrapper[7728]: I0223 14:25:23.714025 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused"
Feb 23 14:25:25.198004 master-0 kubenswrapper[7728]: I0223 14:25:25.197907 7728 patch_prober.go:28] interesting pod/authentication-operator-5bd7c86784-mlbx2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body=
Feb 23 14:25:25.198582 master-0 kubenswrapper[7728]: I0223 14:25:25.198000 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" podUID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused"
Feb 23 14:25:25.331418 master-0 kubenswrapper[7728]: E0223 14:25:25.331290 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Feb 23 14:25:25.911884 master-0 kubenswrapper[7728]: I0223 14:25:25.911686 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Feb 23 14:25:27.040876 master-0 kubenswrapper[7728]: E0223 14:25:27.040726 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=<
Feb 23 14:25:27.040876 master-0 kubenswrapper[7728]: &Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e60016c4c949 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:ProbeError,Message:Readiness probe error: Get "https://10.128.0.52:8443/healthz": dial tcp 10.128.0.52:8443: connect: connection refused
Feb 23 14:25:27.040876 master-0 kubenswrapper[7728]: body:
Feb 23 14:25:27.040876 master-0 kubenswrapper[7728]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:51.473252681 +0000 UTC m=+84.435913977,LastTimestamp:2026-02-23 14:19:51.485857691 +0000 UTC m=+84.448518987,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}
Feb 23 14:25:27.040876 master-0 kubenswrapper[7728]: >
Feb 23 14:25:27.533470 master-0 kubenswrapper[7728]: I0223 14:25:27.533415 7728 scope.go:117] "RemoveContainer" containerID="9576dd15e5e70c1d1ba1e6d5d639886620c60fa49c2ad4add67f8fd17b2dd5ba"
Feb 23 14:25:30.266725 master-0 kubenswrapper[7728]: I0223 14:25:30.265803 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body=
Feb 23 14:25:30.266725 master-0 kubenswrapper[7728]: I0223 14:25:30.265930 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused"
Feb 23 14:25:30.292334 master-0 kubenswrapper[7728]: I0223 14:25:30.291573 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Feb 23 14:25:30.362639 master-0 kubenswrapper[7728]: E0223 14:25:30.362452 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 14:25:33.713075 master-0 kubenswrapper[7728]: I0223 14:25:33.712952 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body=
Feb 23 14:25:33.713075 master-0 kubenswrapper[7728]: I0223 14:25:33.713028 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body=
Feb 23 14:25:33.714017 master-0 kubenswrapper[7728]: I0223 14:25:33.713086 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused"
Feb 23 14:25:33.714017 master-0 kubenswrapper[7728]: I0223 14:25:33.713129 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused"
Feb 23 14:25:35.199007 master-0 kubenswrapper[7728]: I0223 14:25:35.198924 7728 patch_prober.go:28] interesting pod/authentication-operator-5bd7c86784-mlbx2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body=
Feb 23 14:25:35.199842 master-0 kubenswrapper[7728]: I0223 14:25:35.199010 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" podUID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused"
Feb 23 14:25:35.911635 master-0 kubenswrapper[7728]: I0223 14:25:35.911577 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Feb 23 14:25:40.266247 master-0 kubenswrapper[7728]: I0223 14:25:40.266124 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body=
Feb 23 14:25:40.266247 master-0 kubenswrapper[7728]: I0223 14:25:40.266222 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused"
Feb 23 14:25:40.292144 master-0 kubenswrapper[7728]: I0223 14:25:40.292079 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Feb 23 14:25:40.363935 master-0 kubenswrapper[7728]: E0223 14:25:40.363862 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 23 14:25:40.363935 master-0 kubenswrapper[7728]: E0223 14:25:40.363925 7728 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 23 14:25:41.365186 master-0 kubenswrapper[7728]: I0223 14:25:41.365135 7728 generic.go:334] "Generic (PLEG): container finished" podID="b9774f8c-0f29-46d8-be77-81bcf74d5994" containerID="94bfdbcfdcf4914977da334b3fd2fe80966ec6c36be33d3628e4eada6361765f" exitCode=0
Feb 23 14:25:42.332409 master-0 kubenswrapper[7728]: E0223 14:25:42.332251 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Feb 23 14:25:43.713463 master-0 kubenswrapper[7728]: I0223 14:25:43.713361 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body=
Feb 23 14:25:43.714283 master-0 kubenswrapper[7728]: I0223 14:25:43.713378 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body=
Feb 23 14:25:43.714283 master-0 kubenswrapper[7728]: I0223 14:25:43.713593 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused"
Feb 23 14:25:43.714283 master-0 kubenswrapper[7728]: I0223 14:25:43.713513 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused"
Feb 23 14:25:45.198918 master-0 kubenswrapper[7728]: I0223 14:25:45.198777 7728 patch_prober.go:28] interesting pod/authentication-operator-5bd7c86784-mlbx2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body=
Feb 23 14:25:45.198918 master-0 kubenswrapper[7728]: I0223 14:25:45.198921 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" podUID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused"
Feb 23 14:25:45.911948 master-0 kubenswrapper[7728]: I0223 14:25:45.911837 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Feb 23 14:25:48.134274 master-0 kubenswrapper[7728]: I0223 14:25:48.134166 7728 status_manager.go:851] "Failed to get status for pod" podUID="12dab5d350ebc129b0bfa4714d330b15" pod="openshift-etcd/etcd-master-0-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0-master-0)"
Feb 23 14:25:50.265031 master-0 kubenswrapper[7728]: I0223 14:25:50.264891 7728 patch_prober.go:28] interesting pod/package-server-manager-5c75f78c8b-cj2l7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused" start-of-body=
Feb 23 14:25:50.265031 master-0 kubenswrapper[7728]: I0223 14:25:50.264957 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" podUID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.17:8080/healthz\": dial tcp 10.128.0.17:8080: connect: connection refused"
Feb 23 14:25:50.426823 master-0 kubenswrapper[7728]: I0223 14:25:50.426757 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/2.log"
Feb 23 14:25:50.427219 master-0 kubenswrapper[7728]: I0223 14:25:50.427182 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/1.log"
Feb 23 14:25:50.428234 master-0 kubenswrapper[7728]: I0223 14:25:50.428206 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/0.log"
Feb 23 14:25:50.428353 master-0 kubenswrapper[7728]: I0223 14:25:50.428250 7728 generic.go:334] "Generic (PLEG): container finished" podID="12b256b7-a57b-4124-8452-25e74cfa7926" containerID="578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991" exitCode=1
Feb 23 14:25:51.442515 master-0 kubenswrapper[7728]: I0223 14:25:51.442412 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" exitCode=1
Feb 23 14:25:53.713283 master-0 kubenswrapper[7728]: I0223 14:25:53.713193 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused" start-of-body=
Feb 23 14:25:53.714360 master-0 kubenswrapper[7728]: I0223 14:25:53.713296 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": dial tcp 10.128.0.64:8443: connect: connection refused"
Feb 23 14:25:55.911692 master-0 kubenswrapper[7728]: I0223 14:25:55.911621 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused"
Feb 23 14:25:57.375958 master-0 kubenswrapper[7728]: E0223 14:25:57.375859 7728 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Feb 23 14:25:57.376552 master-0 kubenswrapper[7728]: E0223 14:25:57.376125 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.015s"
Feb 23 14:25:57.376552 master-0 kubenswrapper[7728]: I0223 14:25:57.376169 7728 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" containerID="cri-o://3d191963e287b24eb8e359eae476b7710f1b01ed3998cce17300434d7f6e8d0b"
Feb 23 14:25:57.376552 master-0 kubenswrapper[7728]: I0223 14:25:57.376186 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:25:57.376552 master-0 kubenswrapper[7728]: I0223 14:25:57.376227 7728 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" containerID="cri-o://0bda8d15a11221e7b98f49af56e0807945868c4a5e5d028da4a5c53d7f410c01"
Feb 23 14:25:57.376552 master-0 kubenswrapper[7728]: I0223 14:25:57.376242 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:25:57.390027 master-0 kubenswrapper[7728]: I0223 14:25:57.389958 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Feb 23 14:25:58.627736 master-0 kubenswrapper[7728]: E0223 14:25:58.627630 7728 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.251s"
Feb 23 14:25:58.627736 master-0 kubenswrapper[7728]: I0223 14:25:58.627707 7728 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" containerID="cri-o://e192093c7698f9c13f14fd55a50b3b960cd4142b3b8cb914299c2709465ffc51"
Feb 23 14:25:58.627736 master-0 kubenswrapper[7728]: I0223 14:25:58.627727 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:25:58.629095 master-0 kubenswrapper[7728]: I0223 14:25:58.627762 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.629095 master-0 kubenswrapper[7728]: I0223 14:25:58.627780 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:25:58.629095 master-0 kubenswrapper[7728]: I0223 14:25:58.627804 7728 status_manager.go:379] "Container startup changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000"
Feb 23 14:25:58.629095 master-0 kubenswrapper[7728]: I0223 14:25:58.627820 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.629095 master-0 kubenswrapper[7728]: I0223 14:25:58.628204 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:25:58.629631 master-0 kubenswrapper[7728]: I0223 14:25:58.629586 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.629631 master-0 kubenswrapper[7728]: I0223 14:25:58.629625 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629639 7728 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="3b737ae4-e677-4455-a567-97f8dfbedc5c"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629656 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629681 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629700 7728 status_manager.go:317] "Container readiness changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629710 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629723 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629740 7728 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" containerID="cri-o://943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629749 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629785 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629803 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0fdb9885-7479-43b5-8613-b2857a798ade","Type":"ContainerDied","Data":"0a7994e86e7ddf474fa9a6e9d028e17c8d71e5299119418e1b05d25a7b604984"}
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629823 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerDied","Data":"0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83"}
Feb 23 14:25:58.629833 master-0 kubenswrapper[7728]: I0223 14:25:58.629840 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerDied","Data":"5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea"}
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.629869 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.629884 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerDied","Data":"444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae"}
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.629901 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerDied","Data":"5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673"}
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.629920 7728 scope.go:117] "RemoveContainer" containerID="0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83"
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.630796 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.630840 7728 scope.go:117] "RemoveContainer" containerID="fafa7b0f21c17417165ff9592e80bbb6992685b66472f608cb30827b7d663491"
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.630871 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerDied","Data":"626890ddbc06982ad60de27c4c4ad3f994d6a386f27886fbc0cdba298ce4fc87"}
Feb 23 14:25:58.631025 master-0 kubenswrapper[7728]: I0223 14:25:58.630955 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:25:58.631846 master-0 kubenswrapper[7728]: I0223 14:25:58.631166 7728 scope.go:117] "RemoveContainer" containerID="54011ffa8f000620849835983f9e2c00740786e321e8cd4e4de797c7d208b465"
Feb 23 14:25:58.631846 master-0 kubenswrapper[7728]: I0223 14:25:58.631372 7728 scope.go:117] "RemoveContainer" containerID="3d6da8a2ab007c14781f7a758e38f4dc17838974a913b095ba4be079439082e2"
Feb 23 14:25:58.631846 master-0 kubenswrapper[7728]: I0223 14:25:58.631409 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"
Feb 23 14:25:58.631846 master-0 kubenswrapper[7728]: I0223 14:25:58.631439 7728 scope.go:117] "RemoveContainer" containerID="f38113657e6647d113d4b8b771a4b871cb4df714ffeae8172aebba272b7e4da9"
Feb 23 14:25:58.631846 master-0 kubenswrapper[7728]: I0223 14:25:58.631685 7728 scope.go:117] "RemoveContainer" containerID="4d73e7e4ca95353fc1daf5c78e6fb2d258da6e5bbcc4add88a2d98722b1263c5"
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632001 7728 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" containerID="cri-o://0a3915eaedd169a17fc20783989eb20aa548b21f919f4de39f43389e2994de7c"
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632023 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632047 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632069 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerDied","Data":"4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818"}
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632134 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632154 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerDied","Data":"86a800fe59aed9a0c248de7a352a6c1ffaea2cbdde27bb246147baa866e1c79a"}
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632177 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c"}
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632218 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632235 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:25:58.632298 master-0 kubenswrapper[7728]: I0223 14:25:58.632253 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632337 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632363 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632692 7728 scope.go:117] "RemoveContainer" containerID="3e43920c8c9e66c01584e52a234477388c129ea94fe151ecc6c23098a8981522"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632814 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632847 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632900 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632921 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerStarted","Data":"0e0e360765d8d16da79c870190987bdf00a3af7c783a06b62100aaa85b3602c9"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632941 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632958 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.632992 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633028 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633045 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"493a9ed3-6d64-489a-a68c-235b69a58782","Type":"ContainerDied","Data":"1df16973da8e7c98a51b37b7335c255585ebd5dc4bbbed0d842fe3c32df42186"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633068 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerDied","Data":"475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633094 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633112 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633205 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633247 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"0a3915eaedd169a17fc20783989eb20aa548b21f919f4de39f43389e2994de7c"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633270 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerStarted","Data":"7afdcdc79bcf059c9a09c1210b1f2828b6a1174f97563a28f6b04cfb2b6ff9e4"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633293 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" event={"ID":"585f74db-4593-426b-b0c7-ec8f64810549","Type":"ContainerDied","Data":"3d191963e287b24eb8e359eae476b7710f1b01ed3998cce17300434d7f6e8d0b"}
Feb 23 14:25:58.633269 master-0 kubenswrapper[7728]: I0223 14:25:58.633314 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerDied","Data":"0bda8d15a11221e7b98f49af56e0807945868c4a5e5d028da4a5c53d7f410c01"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633337 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerDied","Data":"e192093c7698f9c13f14fd55a50b3b960cd4142b3b8cb914299c2709465ffc51"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633361 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633383 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerDied","Data":"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633406 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerDied","Data":"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633429 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerDied","Data":"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633448 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266" event={"ID":"4373687a-61a0-434b-81f7-3fecaa1494ef","Type":"ContainerDied","Data":"9b45bf126e1d92621372b72946a5700b9c49834f8698b4a6266b185922dfcbee"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633471 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerDied","Data":"1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633524 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerDied","Data":"be11245e52df36836387b793176a5296c3112993cdce052d05331b901d833321"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633544 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerDied","Data":"943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633565 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerDied","Data":"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633587 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633605 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"945eeda0e74497e88a62481b91d0c1abd43853d97eee9a925d78fc6fc7443101"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633623 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerStarted","Data":"18b02500a922018fef0fe170792a110deb1ca490ebe442765b459f2885b97744"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633644 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerStarted","Data":"81ff2e6b5bae83ef9904fdb97ede2ea9a1442b7adab5da48d4302eefab7a166a"}
Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633664 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
event={"ID":"4373687a-61a0-434b-81f7-3fecaa1494ef","Type":"ContainerStarted","Data":"61f0e42918568e19cfd95e1214cb522d9ea19f33de09143d470f6bb7988c8d8a"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633682 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" event={"ID":"585f74db-4593-426b-b0c7-ec8f64810549","Type":"ContainerStarted","Data":"e8151e4f2721f179d56208bf1c11204648825d6daf5824777e4cd2cde1fcc527"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633700 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633717 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"3ea10ed9b3b081ac010f974ab393059b11852999309c95ddd5381bd40e623b2e"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633737 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerStarted","Data":"796f1fd46fb4b7de05e9d7265e3f4d090bdf2c82e271c42f786a885903a59f3d"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633755 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerStarted","Data":"4077843a1666052f92a3616061104688e7ad630e49b861a884467e2e98bfca5d"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633774 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerStarted","Data":"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633791 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerStarted","Data":"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633810 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0fdb9885-7479-43b5-8613-b2857a798ade","Type":"ContainerDied","Data":"3cbdfb9045c2d2cb397063c37573cc9d345a2e61b6805238ad5391bd43edfbaa"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633832 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cbdfb9045c2d2cb397063c37573cc9d345a2e61b6805238ad5391bd43edfbaa" Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633850 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerStarted","Data":"f1624e490e996adbc1581323e6fc52ba5b194ad2cd07588f621ae1c9497226e8"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633869 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerStarted","Data":"5c9a811dd7ca05a47e75def9ccd5b3cad7fd1e69fba1f4ac35541ff048398ec8"} Feb 23 
14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633869 7728 scope.go:117] "RemoveContainer" containerID="30dd1f19a8b444dbc9b769a06f0917819d1c1e9174b5fb3b5552595a9eed345f" Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.633887 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerStarted","Data":"a2ae49d4722a1e40cd55dc37aa9260d992134f5c1ca873bbced79ae9e75c00b6"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634405 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634429 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerStarted","Data":"b4c3088f58a98599776247fbbf6c6a5a5751aeab22dce8d8142cec8b35a3fab9"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634450 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"493a9ed3-6d64-489a-a68c-235b69a58782","Type":"ContainerDied","Data":"cb9c2c793a1a03d8088100a56c493a223ec7cd474c24708ed4bb05825975b542"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634472 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb9c2c793a1a03d8088100a56c493a223ec7cd474c24708ed4bb05825975b542" Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634520 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerStarted","Data":"80c3493a0d8d53c83776cd0edf83e55e36824f6eb21d9ce4f03e101c3a13e139"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634545 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerDied","Data":"4850e29e1670d0434d8ca87c5950a0424937b61be4c5fb2ae511df8fe764c7a2"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634571 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerDied","Data":"7afdcdc79bcf059c9a09c1210b1f2828b6a1174f97563a28f6b04cfb2b6ff9e4"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634593 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"0a3915eaedd169a17fc20783989eb20aa548b21f919f4de39f43389e2994de7c"} Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634599 7728 scope.go:117] "RemoveContainer" containerID="578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991" Feb 23 14:25:58.635144 master-0 kubenswrapper[7728]: I0223 14:25:58.634617 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerDied","Data":"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635466 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerDied","Data":"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635574 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635610 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerDied","Data":"4077843a1666052f92a3616061104688e7ad630e49b861a884467e2e98bfca5d"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635644 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerDied","Data":"a2ae49d4722a1e40cd55dc37aa9260d992134f5c1ca873bbced79ae9e75c00b6"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635781 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerDied","Data":"945eeda0e74497e88a62481b91d0c1abd43853d97eee9a925d78fc6fc7443101"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635826 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerDied","Data":"796f1fd46fb4b7de05e9d7265e3f4d090bdf2c82e271c42f786a885903a59f3d"} Feb 23 
14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635860 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerDied","Data":"81ff2e6b5bae83ef9904fdb97ede2ea9a1442b7adab5da48d4302eefab7a166a"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635898 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerDied","Data":"b4c3088f58a98599776247fbbf6c6a5a5751aeab22dce8d8142cec8b35a3fab9"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635930 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" event={"ID":"57b57915-64dd-42f5-b06f-bc4bcc06b667","Type":"ContainerDied","Data":"2f96ee533f5d52939bd2d7faf41993b118d9a6bfbb0b89e7580d1b1a849ba083"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.635961 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerDied","Data":"031c49419dbbce343a020e2a52b0b21aa31f7846ce6d6338d427aedeeb387c27"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636013 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerDied","Data":"bb60c962ed53b03fdfea9c76fcac5c126728571b797b7f917d784b1b7debd024"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636047 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerDied","Data":"5f94c8fde6ae66d48d8282c5c57e237550057485795720cf3e4f35047fc2b408"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636381 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerDied","Data":"3ea10ed9b3b081ac010f974ab393059b11852999309c95ddd5381bd40e623b2e"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636421 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerStarted","Data":"deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636673 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"0bba5b88522722fd7c81731469c726d0abf8c593b96b63602cc16566d85db157"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636716 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerDied","Data":"a927374fcf62fad56c9d8325450d07e92c07f04787ed291d9c0071fab4d22549"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636773 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" 
event={"ID":"57b57915-64dd-42f5-b06f-bc4bcc06b667","Type":"ContainerStarted","Data":"26b79c94d8eda0b324339825ea5aaa008bf89734d46acf3467ff356aa52ce675"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636804 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.636834 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerStarted","Data":"a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.637154 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerDied","Data":"3e43920c8c9e66c01584e52a234477388c129ea94fe151ecc6c23098a8981522"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.637192 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"4f6e215689332bab70cfb5b43a7cfcbaa2bd241cb7a2a1c1757464250604d426"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.637432 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerStarted","Data":"cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 
14:25:58.637518 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerStarted","Data":"22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.637564 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerStarted","Data":"e062ef1f26297d24d2516be8292a3297ef7a87cfa574a75bfb2f2e2e904d65e1"} Feb 23 14:25:58.643387 master-0 kubenswrapper[7728]: I0223 14:25:58.637594 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"9abb82bc4e660ae80bfe0a01a4c7f25bfcf62f98ac7b617e82941def46f78a19"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: E0223 14:25:58.643453 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-d6bb9bb76-4frj6_openshift-machine-api(12b256b7-a57b-4124-8452-25e74cfa7926)\"" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" podUID="12b256b7-a57b-4124-8452-25e74cfa7926" Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.637622 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerStarted","Data":"50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646155 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerStarted","Data":"4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646186 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646207 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646230 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerStarted","Data":"41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646253 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"4d73e7e4ca95353fc1daf5c78e6fb2d258da6e5bbcc4add88a2d98722b1263c5"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646276 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" 
event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerStarted","Data":"525b335554d223a0f792c02a10050ad9f40b958440d7f69f8c4c394f4e398780"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646307 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"6a2e68abb8955d199221c9563462d7c40348ae0f0d637124ee947f300645aca2"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646343 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"cce51c3475ff5b0dfbfd6ba41c684f036ddeb5d98e0d3c9ece1c70d0cc4cd606"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646388 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"8ebd95875caac8439baca54415c00dcf7fafd7bb372421de30584dee13828051"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646412 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"d783393f51b6dc83f57672bdad13e558c8969087e1d7a88a2a8c67c244b55dbe"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646433 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"ecda0f59f77c55aef5b6997149fe196458b5009b38a90f2c2ecb3d3be2666b23"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646452 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" 
event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerDied","Data":"3d6da8a2ab007c14781f7a758e38f4dc17838974a913b095ba4be079439082e2"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646529 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" event={"ID":"709ac071-4392-4a3f-a3d1-4bc8ba2f6236","Type":"ContainerDied","Data":"c28d30a2b760e3ebbe98681a086eea9adf4942f9ca5f692597b7830f1309f2a8"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646560 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerDied","Data":"08d3df84ad8de18eec9e6a636baf4cb95ff798ffedb9a2d917a6b77d6c934fb7"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646583 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" event={"ID":"ad0f0d72-0337-4347-bb50-e299a175f3ca","Type":"ContainerDied","Data":"c3f209a9ce16ae00e125bd88a555117337a8948041a4b5c781124f66c958f969"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646610 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerDied","Data":"54011ffa8f000620849835983f9e2c00740786e321e8cd4e4de797c7d208b465"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646634 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" event={"ID":"fbb66172-1ea9-4683-b88f-227c4fd94924","Type":"ContainerDied","Data":"4bd96aadee1934ae65fac50d897e75007505a399d7d143ad871ced8edd81b895"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646658 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerDied","Data":"30dd1f19a8b444dbc9b769a06f0917819d1c1e9174b5fb3b5552595a9eed345f"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646694 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" event={"ID":"a4ae9292-71dc-4484-b277-43cb26c1e04d","Type":"ContainerDied","Data":"fafa7b0f21c17417165ff9592e80bbb6992685b66472f608cb30827b7d663491"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646718 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"f38113657e6647d113d4b8b771a4b871cb4df714ffeae8172aebba272b7e4da9"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646740 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerDied","Data":"4d73e7e4ca95353fc1daf5c78e6fb2d258da6e5bbcc4add88a2d98722b1263c5"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646779 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" event={"ID":"b9774f8c-0f29-46d8-be77-81bcf74d5994","Type":"ContainerDied","Data":"94bfdbcfdcf4914977da334b3fd2fe80966ec6c36be33d3628e4eada6361765f"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.646806 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" 
event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerDied","Data":"578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.647284 7728 scope.go:117] "RemoveContainer" containerID="c28d30a2b760e3ebbe98681a086eea9adf4942f9ca5f692597b7830f1309f2a8" Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.647795 7728 scope.go:117] "RemoveContainer" containerID="c3f209a9ce16ae00e125bd88a555117337a8948041a4b5c781124f66c958f969" Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.647787 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"} Feb 23 14:25:58.650289 master-0 kubenswrapper[7728]: I0223 14:25:58.648992 7728 scope.go:117] "RemoveContainer" containerID="94bfdbcfdcf4914977da334b3fd2fe80966ec6c36be33d3628e4eada6361765f" Feb 23 14:25:58.659708 master-0 kubenswrapper[7728]: I0223 14:25:58.652002 7728 scope.go:117] "RemoveContainer" containerID="08d3df84ad8de18eec9e6a636baf4cb95ff798ffedb9a2d917a6b77d6c934fb7" Feb 23 14:25:58.659708 master-0 kubenswrapper[7728]: I0223 14:25:58.652170 7728 scope.go:117] "RemoveContainer" containerID="4bd96aadee1934ae65fac50d897e75007505a399d7d143ad871ced8edd81b895" Feb 23 14:25:58.661469 master-0 kubenswrapper[7728]: I0223 14:25:58.660746 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Feb 23 14:25:58.661469 master-0 kubenswrapper[7728]: I0223 14:25:58.660784 7728 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="3b737ae4-e677-4455-a567-97f8dfbedc5c" Feb 23 14:25:58.669243 master-0 kubenswrapper[7728]: I0223 14:25:58.668600 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-kube-controller-manager/installer-1-master-0"] Feb 23 14:25:58.673236 master-0 kubenswrapper[7728]: I0223 14:25:58.672320 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Feb 23 14:25:58.727769 master-0 kubenswrapper[7728]: I0223 14:25:58.724008 7728 scope.go:117] "RemoveContainer" containerID="5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea" Feb 23 14:25:58.760631 master-0 kubenswrapper[7728]: I0223 14:25:58.758629 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podStartSLOduration=371.161646566 podStartE2EDuration="6m28.758607099s" podCreationTimestamp="2026-02-23 14:19:30 +0000 UTC" firstStartedPulling="2026-02-23 14:19:31.143694338 +0000 UTC m=+64.106355634" lastFinishedPulling="2026-02-23 14:19:48.740654881 +0000 UTC m=+81.703316167" observedRunningTime="2026-02-23 14:25:58.755253073 +0000 UTC m=+451.717914389" watchObservedRunningTime="2026-02-23 14:25:58.758607099 +0000 UTC m=+451.721268415" Feb 23 14:25:58.804515 master-0 kubenswrapper[7728]: I0223 14:25:58.804391 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vwtc6" podStartSLOduration=360.362890095 podStartE2EDuration="6m26.804368476s" podCreationTimestamp="2026-02-23 14:19:32 +0000 UTC" firstStartedPulling="2026-02-23 14:19:44.366438867 +0000 UTC m=+77.329100163" lastFinishedPulling="2026-02-23 14:20:10.807917258 +0000 UTC m=+103.770578544" observedRunningTime="2026-02-23 14:25:58.800152377 +0000 UTC m=+451.762813673" watchObservedRunningTime="2026-02-23 14:25:58.804368476 +0000 UTC m=+451.767029782" Feb 23 14:25:58.821739 master-0 kubenswrapper[7728]: I0223 14:25:58.821679 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" 
podStartSLOduration=362.732862914 podStartE2EDuration="6m27.82165895s" podCreationTimestamp="2026-02-23 14:19:31 +0000 UTC" firstStartedPulling="2026-02-23 14:19:43.051085858 +0000 UTC m=+76.013747154" lastFinishedPulling="2026-02-23 14:20:08.139881884 +0000 UTC m=+101.102543190" observedRunningTime="2026-02-23 14:25:58.819578007 +0000 UTC m=+451.782239323" watchObservedRunningTime="2026-02-23 14:25:58.82165895 +0000 UTC m=+451.784320256" Feb 23 14:25:58.875974 master-0 kubenswrapper[7728]: I0223 14:25:58.875930 7728 scope.go:117] "RemoveContainer" containerID="444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae" Feb 23 14:25:58.916551 master-0 kubenswrapper[7728]: I0223 14:25:58.916504 7728 scope.go:117] "RemoveContainer" containerID="5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673" Feb 23 14:25:58.947658 master-0 kubenswrapper[7728]: I0223 14:25:58.947611 7728 scope.go:117] "RemoveContainer" containerID="4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818" Feb 23 14:25:59.027653 master-0 kubenswrapper[7728]: I0223 14:25:59.027315 7728 scope.go:117] "RemoveContainer" containerID="bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430" Feb 23 14:25:59.109332 master-0 kubenswrapper[7728]: I0223 14:25:59.109300 7728 scope.go:117] "RemoveContainer" containerID="a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000" Feb 23 14:25:59.144764 master-0 kubenswrapper[7728]: I0223 14:25:59.144723 7728 scope.go:117] "RemoveContainer" containerID="12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c" Feb 23 14:25:59.172290 master-0 kubenswrapper[7728]: I0223 14:25:59.172240 7728 scope.go:117] "RemoveContainer" containerID="b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46" Feb 23 14:25:59.229962 master-0 kubenswrapper[7728]: I0223 14:25:59.229908 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1bffce5-019a-4c97-85f2-929dc19a0bde" 
path="/var/lib/kubelet/pods/d1bffce5-019a-4c97-85f2-929dc19a0bde/volumes" Feb 23 14:25:59.281675 master-0 kubenswrapper[7728]: E0223 14:25:59.281624 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:25:59.286213 master-0 kubenswrapper[7728]: I0223 14:25:59.286154 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" podStartSLOduration=374.286141344 podStartE2EDuration="6m14.286141344s" podCreationTimestamp="2026-02-23 14:19:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:25:59.284150143 +0000 UTC m=+452.246811429" watchObservedRunningTime="2026-02-23 14:25:59.286141344 +0000 UTC m=+452.248802640" Feb 23 14:25:59.294887 master-0 kubenswrapper[7728]: I0223 14:25:59.294716 7728 scope.go:117] "RemoveContainer" containerID="475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d" Feb 23 14:25:59.334007 master-0 kubenswrapper[7728]: E0223 14:25:59.333913 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:25:59.366995 master-0 kubenswrapper[7728]: I0223 14:25:59.366961 7728 scope.go:117] "RemoveContainer" containerID="502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71" Feb 23 14:25:59.452895 master-0 kubenswrapper[7728]: I0223 
14:25:59.452793 7728 scope.go:117] "RemoveContainer" containerID="5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748" Feb 23 14:25:59.488084 master-0 kubenswrapper[7728]: I0223 14:25:59.488023 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xxh6f"] Feb 23 14:25:59.505451 master-0 kubenswrapper[7728]: I0223 14:25:59.505407 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xxh6f"] Feb 23 14:25:59.512829 master-0 kubenswrapper[7728]: I0223 14:25:59.512580 7728 scope.go:117] "RemoveContainer" containerID="bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430" Feb 23 14:25:59.513195 master-0 kubenswrapper[7728]: E0223 14:25:59.513114 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430\": container with ID starting with bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430 not found: ID does not exist" containerID="bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430" Feb 23 14:25:59.513288 master-0 kubenswrapper[7728]: I0223 14:25:59.513209 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430"} err="failed to get container status \"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430\": rpc error: code = NotFound desc = could not find container \"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430\": container with ID starting with bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430 not found: ID does not exist" Feb 23 14:25:59.513345 master-0 kubenswrapper[7728]: I0223 14:25:59.513297 7728 scope.go:117] "RemoveContainer" containerID="a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000" Feb 23 14:25:59.513847 master-0 
kubenswrapper[7728]: E0223 14:25:59.513782 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000\": container with ID starting with a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000 not found: ID does not exist" containerID="a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000" Feb 23 14:25:59.513906 master-0 kubenswrapper[7728]: I0223 14:25:59.513852 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000"} err="failed to get container status \"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000\": rpc error: code = NotFound desc = could not find container \"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000\": container with ID starting with a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000 not found: ID does not exist" Feb 23 14:25:59.513906 master-0 kubenswrapper[7728]: I0223 14:25:59.513871 7728 scope.go:117] "RemoveContainer" containerID="12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c" Feb 23 14:25:59.514324 master-0 kubenswrapper[7728]: E0223 14:25:59.514298 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c\": container with ID starting with 12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c not found: ID does not exist" containerID="12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c" Feb 23 14:25:59.514375 master-0 kubenswrapper[7728]: I0223 14:25:59.514327 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c"} err="failed to get container status 
\"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c\": rpc error: code = NotFound desc = could not find container \"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c\": container with ID starting with 12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c not found: ID does not exist" Feb 23 14:25:59.514409 master-0 kubenswrapper[7728]: I0223 14:25:59.514374 7728 scope.go:117] "RemoveContainer" containerID="b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46" Feb 23 14:25:59.514695 master-0 kubenswrapper[7728]: E0223 14:25:59.514668 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46\": container with ID starting with b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46 not found: ID does not exist" containerID="b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46" Feb 23 14:25:59.514751 master-0 kubenswrapper[7728]: I0223 14:25:59.514698 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46"} err="failed to get container status \"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46\": rpc error: code = NotFound desc = could not find container \"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46\": container with ID starting with b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46 not found: ID does not exist" Feb 23 14:25:59.514751 master-0 kubenswrapper[7728]: I0223 14:25:59.514715 7728 scope.go:117] "RemoveContainer" containerID="3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3" Feb 23 14:25:59.550401 master-0 kubenswrapper[7728]: I0223 14:25:59.550359 7728 scope.go:117] "RemoveContainer" containerID="b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69" Feb 23 
14:25:59.603632 master-0 kubenswrapper[7728]: I0223 14:25:59.603573 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n82gm"] Feb 23 14:25:59.615451 master-0 kubenswrapper[7728]: I0223 14:25:59.615365 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n82gm"] Feb 23 14:25:59.628214 master-0 kubenswrapper[7728]: I0223 14:25:59.625090 7728 scope.go:117] "RemoveContainer" containerID="1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2" Feb 23 14:25:59.641351 master-0 kubenswrapper[7728]: I0223 14:25:59.641287 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"db7391fbfbce6a5b2d3e6ea64af30eda73902a756b9cef12c2e6e67aee0522bb"} Feb 23 14:25:59.642055 master-0 kubenswrapper[7728]: I0223 14:25:59.641997 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:25:59.642570 master-0 kubenswrapper[7728]: E0223 14:25:59.642516 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:25:59.644009 master-0 kubenswrapper[7728]: I0223 14:25:59.643963 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/cluster-autoscaler-operator/0.log" Feb 23 14:25:59.645238 master-0 kubenswrapper[7728]: I0223 14:25:59.644373 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerStarted","Data":"dfdd31784cff3d18e2ca10b9d658f04a933e734fce723e106164c1f2dd94e36b"} Feb 23 14:25:59.650877 master-0 kubenswrapper[7728]: I0223 14:25:59.650829 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/2.log" Feb 23 14:25:59.651596 master-0 kubenswrapper[7728]: I0223 14:25:59.651542 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:25:59.653968 master-0 kubenswrapper[7728]: I0223 14:25:59.653919 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/1.log" Feb 23 14:25:59.656275 master-0 kubenswrapper[7728]: I0223 14:25:59.656241 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" event={"ID":"b9774f8c-0f29-46d8-be77-81bcf74d5994","Type":"ContainerStarted","Data":"076f7d65fee110708a8e6296446403bfd54323b5e373a5fdc91a115dfcb1e945"} Feb 23 14:25:59.658945 master-0 kubenswrapper[7728]: I0223 14:25:59.658808 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" event={"ID":"fbb66172-1ea9-4683-b88f-227c4fd94924","Type":"ContainerStarted","Data":"688f3915cda6841fdb5b96d4c61c8187855740e4ef08e7e26901e30e5fbf3918"} Feb 23 14:25:59.661084 master-0 kubenswrapper[7728]: I0223 14:25:59.661039 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" 
event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerStarted","Data":"cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7"} Feb 23 14:25:59.661889 master-0 kubenswrapper[7728]: I0223 14:25:59.661747 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:25:59.663871 master-0 kubenswrapper[7728]: I0223 14:25:59.663660 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/1.log" Feb 23 14:25:59.666299 master-0 kubenswrapper[7728]: I0223 14:25:59.665987 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" event={"ID":"709ac071-4392-4a3f-a3d1-4bc8ba2f6236","Type":"ContainerStarted","Data":"856c1fc3c90a7b6e24237be79fd4d77aac6d053d0596ff94d90d617594f6c02a"} Feb 23 14:25:59.668983 master-0 kubenswrapper[7728]: I0223 14:25:59.667615 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/2.log" Feb 23 14:25:59.668983 master-0 kubenswrapper[7728]: I0223 14:25:59.667706 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635"} Feb 23 14:25:59.669769 master-0 kubenswrapper[7728]: I0223 14:25:59.669735 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" event={"ID":"a4ae9292-71dc-4484-b277-43cb26c1e04d","Type":"ContainerStarted","Data":"61dc17f9e6cf7debb617c1a5d6bf61a564d357e02c830ef1638e669b87de1835"} Feb 23 
14:25:59.671702 master-0 kubenswrapper[7728]: I0223 14:25:59.671672 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/1.log" Feb 23 14:25:59.673453 master-0 kubenswrapper[7728]: I0223 14:25:59.673420 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-5c75f78c8b-cj2l7_5b54fc16-d2f7-4b10-a611-5b411b389c5a/package-server-manager/0.log" Feb 23 14:25:59.673961 master-0 kubenswrapper[7728]: I0223 14:25:59.673922 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerStarted","Data":"d129e9b13c717050b25d9d0f6e17182a80ea8a33b9d790e963d24636d1efd35e"} Feb 23 14:25:59.674149 master-0 kubenswrapper[7728]: I0223 14:25:59.674093 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:25:59.676237 master-0 kubenswrapper[7728]: I0223 14:25:59.676200 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerStarted","Data":"0ee1bcffd0080993a8a7d5ae7b401f3f8009a9023b5b1079dc04460210753ad8"} Feb 23 14:25:59.678124 master-0 kubenswrapper[7728]: I0223 14:25:59.678102 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" event={"ID":"ad0f0d72-0337-4347-bb50-e299a175f3ca","Type":"ContainerStarted","Data":"7ba62a28fa741876aeafcc419bd3f3721acce59a5947cd8fcdc354def4e8ba87"} Feb 23 14:25:59.679257 master-0 kubenswrapper[7728]: I0223 14:25:59.679014 7728 scope.go:117] "RemoveContainer" 
containerID="93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3" Feb 23 14:25:59.681364 master-0 kubenswrapper[7728]: I0223 14:25:59.681308 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerStarted","Data":"0d84cce0e88dcc70d83b8dd67a4e91c62d1f30fed4495c32ca427288ab62004f"} Feb 23 14:25:59.683633 master-0 kubenswrapper[7728]: I0223 14:25:59.683616 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/1.log" Feb 23 14:25:59.686145 master-0 kubenswrapper[7728]: I0223 14:25:59.686109 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/1.log" Feb 23 14:25:59.688554 master-0 kubenswrapper[7728]: I0223 14:25:59.688530 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/1.log" Feb 23 14:25:59.691469 master-0 kubenswrapper[7728]: I0223 14:25:59.691424 7728 scope.go:117] "RemoveContainer" containerID="578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991" Feb 23 14:25:59.691879 master-0 kubenswrapper[7728]: E0223 14:25:59.691820 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-d6bb9bb76-4frj6_openshift-machine-api(12b256b7-a57b-4124-8452-25e74cfa7926)\"" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" podUID="12b256b7-a57b-4124-8452-25e74cfa7926" Feb 23 
14:25:59.727516 master-0 kubenswrapper[7728]: I0223 14:25:59.727463 7728 scope.go:117] "RemoveContainer" containerID="3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec" Feb 23 14:25:59.738153 master-0 kubenswrapper[7728]: I0223 14:25:59.738084 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:25:59.755113 master-0 kubenswrapper[7728]: I0223 14:25:59.755075 7728 scope.go:117] "RemoveContainer" containerID="59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9" Feb 23 14:25:59.779785 master-0 kubenswrapper[7728]: I0223 14:25:59.779754 7728 scope.go:117] "RemoveContainer" containerID="515b3836a32aed4579312ac49c6468a1e7035624b7a30950b8364d5d10c9310d" Feb 23 14:25:59.792301 master-0 kubenswrapper[7728]: I0223 14:25:59.792189 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mtvwp" podStartSLOduration=358.213985899 podStartE2EDuration="6m24.792159747s" podCreationTimestamp="2026-02-23 14:19:35 +0000 UTC" firstStartedPulling="2026-02-23 14:19:44.280959645 +0000 UTC m=+77.243620941" lastFinishedPulling="2026-02-23 14:20:10.859133493 +0000 UTC m=+103.821794789" observedRunningTime="2026-02-23 14:25:59.790050763 +0000 UTC m=+452.752712069" watchObservedRunningTime="2026-02-23 14:25:59.792159747 +0000 UTC m=+452.754821083" Feb 23 14:25:59.822165 master-0 kubenswrapper[7728]: I0223 14:25:59.822107 7728 scope.go:117] "RemoveContainer" containerID="502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71" Feb 23 14:25:59.823027 master-0 kubenswrapper[7728]: E0223 14:25:59.822974 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71\": container with ID starting with 502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71 not found: ID does 
not exist" containerID="502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71" Feb 23 14:25:59.823130 master-0 kubenswrapper[7728]: I0223 14:25:59.823029 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71"} err="failed to get container status \"502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71\": rpc error: code = NotFound desc = could not find container \"502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71\": container with ID starting with 502abf4ea3cb690eb21a0ba5e773be5fbc2712d7f83f4ac4448a35b53cf2ac71 not found: ID does not exist" Feb 23 14:25:59.823130 master-0 kubenswrapper[7728]: I0223 14:25:59.823058 7728 scope.go:117] "RemoveContainer" containerID="5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748" Feb 23 14:25:59.823503 master-0 kubenswrapper[7728]: E0223 14:25:59.823440 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748\": container with ID starting with 5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748 not found: ID does not exist" containerID="5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748" Feb 23 14:25:59.823590 master-0 kubenswrapper[7728]: I0223 14:25:59.823518 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748"} err="failed to get container status \"5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748\": rpc error: code = NotFound desc = could not find container \"5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748\": container with ID starting with 5d82b70b9c0cfec9d3d38ffda7072c232f8227384d8dba5c3b39ed19470ad748 not found: ID does not exist" Feb 23 14:25:59.823590 master-0 
kubenswrapper[7728]: I0223 14:25:59.823554 7728 scope.go:117] "RemoveContainer" containerID="3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3" Feb 23 14:25:59.824084 master-0 kubenswrapper[7728]: E0223 14:25:59.824047 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3\": container with ID starting with 3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3 not found: ID does not exist" containerID="3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3" Feb 23 14:25:59.824218 master-0 kubenswrapper[7728]: I0223 14:25:59.824187 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3"} err="failed to get container status \"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3\": rpc error: code = NotFound desc = could not find container \"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3\": container with ID starting with 3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3 not found: ID does not exist" Feb 23 14:25:59.824310 master-0 kubenswrapper[7728]: I0223 14:25:59.824294 7728 scope.go:117] "RemoveContainer" containerID="b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69" Feb 23 14:25:59.825004 master-0 kubenswrapper[7728]: E0223 14:25:59.824948 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69\": container with ID starting with b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69 not found: ID does not exist" containerID="b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69" Feb 23 14:25:59.825102 master-0 kubenswrapper[7728]: I0223 14:25:59.824995 7728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69"} err="failed to get container status \"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69\": rpc error: code = NotFound desc = could not find container \"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69\": container with ID starting with b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69 not found: ID does not exist" Feb 23 14:25:59.825102 master-0 kubenswrapper[7728]: I0223 14:25:59.825024 7728 scope.go:117] "RemoveContainer" containerID="3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec" Feb 23 14:25:59.825492 master-0 kubenswrapper[7728]: E0223 14:25:59.825424 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec\": container with ID starting with 3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec not found: ID does not exist" containerID="3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec" Feb 23 14:25:59.825564 master-0 kubenswrapper[7728]: I0223 14:25:59.825473 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec"} err="failed to get container status \"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec\": rpc error: code = NotFound desc = could not find container \"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec\": container with ID starting with 3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec not found: ID does not exist" Feb 23 14:25:59.825564 master-0 kubenswrapper[7728]: I0223 14:25:59.825529 7728 scope.go:117] "RemoveContainer" containerID="59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9" Feb 23 
14:25:59.826070 master-0 kubenswrapper[7728]: E0223 14:25:59.826035 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9\": container with ID starting with 59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9 not found: ID does not exist" containerID="59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9" Feb 23 14:25:59.826137 master-0 kubenswrapper[7728]: I0223 14:25:59.826106 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9"} err="failed to get container status \"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9\": rpc error: code = NotFound desc = could not find container \"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9\": container with ID starting with 59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9 not found: ID does not exist" Feb 23 14:25:59.826137 master-0 kubenswrapper[7728]: I0223 14:25:59.826132 7728 scope.go:117] "RemoveContainer" containerID="bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430" Feb 23 14:25:59.826546 master-0 kubenswrapper[7728]: I0223 14:25:59.826499 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430"} err="failed to get container status \"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430\": rpc error: code = NotFound desc = could not find container \"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430\": container with ID starting with bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430 not found: ID does not exist" Feb 23 14:25:59.826678 master-0 kubenswrapper[7728]: I0223 14:25:59.826656 7728 scope.go:117] "RemoveContainer" 
containerID="a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000" Feb 23 14:25:59.827227 master-0 kubenswrapper[7728]: I0223 14:25:59.827189 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000"} err="failed to get container status \"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000\": rpc error: code = NotFound desc = could not find container \"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000\": container with ID starting with a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000 not found: ID does not exist" Feb 23 14:25:59.827315 master-0 kubenswrapper[7728]: I0223 14:25:59.827227 7728 scope.go:117] "RemoveContainer" containerID="12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c" Feb 23 14:25:59.827632 master-0 kubenswrapper[7728]: I0223 14:25:59.827603 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c"} err="failed to get container status \"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c\": rpc error: code = NotFound desc = could not find container \"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c\": container with ID starting with 12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c not found: ID does not exist" Feb 23 14:25:59.827737 master-0 kubenswrapper[7728]: I0223 14:25:59.827720 7728 scope.go:117] "RemoveContainer" containerID="b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46" Feb 23 14:25:59.828204 master-0 kubenswrapper[7728]: I0223 14:25:59.828157 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46"} err="failed to get container status 
\"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46\": rpc error: code = NotFound desc = could not find container \"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46\": container with ID starting with b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46 not found: ID does not exist" Feb 23 14:25:59.828290 master-0 kubenswrapper[7728]: I0223 14:25:59.828232 7728 scope.go:117] "RemoveContainer" containerID="5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea" Feb 23 14:25:59.828661 master-0 kubenswrapper[7728]: E0223 14:25:59.828642 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea\": container with ID starting with 5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea not found: ID does not exist" containerID="5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea" Feb 23 14:25:59.828752 master-0 kubenswrapper[7728]: I0223 14:25:59.828733 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea"} err="failed to get container status \"5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea\": rpc error: code = NotFound desc = could not find container \"5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea\": container with ID starting with 5ed2538f1dd4c505937625e4613ce7839a7ad1306cb779a0660bf410856f74ea not found: ID does not exist" Feb 23 14:25:59.828809 master-0 kubenswrapper[7728]: I0223 14:25:59.828799 7728 scope.go:117] "RemoveContainer" containerID="444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae" Feb 23 14:25:59.829188 master-0 kubenswrapper[7728]: E0223 14:25:59.829144 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae\": container with ID starting with 444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae not found: ID does not exist" containerID="444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae" Feb 23 14:25:59.829241 master-0 kubenswrapper[7728]: I0223 14:25:59.829195 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae"} err="failed to get container status \"444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae\": rpc error: code = NotFound desc = could not find container \"444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae\": container with ID starting with 444b5986734e966174e693b843714d39c39b89099075b49c0d4944256ff9f4ae not found: ID does not exist" Feb 23 14:25:59.829241 master-0 kubenswrapper[7728]: I0223 14:25:59.829227 7728 scope.go:117] "RemoveContainer" containerID="1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2" Feb 23 14:25:59.829683 master-0 kubenswrapper[7728]: E0223 14:25:59.829615 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2\": container with ID starting with 1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2 not found: ID does not exist" containerID="1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2" Feb 23 14:25:59.829745 master-0 kubenswrapper[7728]: I0223 14:25:59.829694 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2"} err="failed to get container status \"1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2\": rpc error: code = NotFound desc = could not find container 
\"1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2\": container with ID starting with 1ea2b285f2639d5a18b8a335d5c0eee1af23080ecbfb38dc1a5168ba545660e2 not found: ID does not exist" Feb 23 14:25:59.829778 master-0 kubenswrapper[7728]: I0223 14:25:59.829724 7728 scope.go:117] "RemoveContainer" containerID="93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3" Feb 23 14:25:59.830088 master-0 kubenswrapper[7728]: E0223 14:25:59.830064 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3\": container with ID starting with 93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3 not found: ID does not exist" containerID="93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3" Feb 23 14:25:59.830180 master-0 kubenswrapper[7728]: I0223 14:25:59.830164 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3"} err="failed to get container status \"93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3\": rpc error: code = NotFound desc = could not find container \"93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3\": container with ID starting with 93ee993f97732b66b7b7fa627308e4fbe3771a952955dfa9d4f021a884360bf3 not found: ID does not exist" Feb 23 14:25:59.830239 master-0 kubenswrapper[7728]: I0223 14:25:59.830229 7728 scope.go:117] "RemoveContainer" containerID="5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673" Feb 23 14:25:59.830758 master-0 kubenswrapper[7728]: E0223 14:25:59.830669 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673\": container with ID starting with 
5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673 not found: ID does not exist" containerID="5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673" Feb 23 14:25:59.830819 master-0 kubenswrapper[7728]: I0223 14:25:59.830767 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673"} err="failed to get container status \"5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673\": rpc error: code = NotFound desc = could not find container \"5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673\": container with ID starting with 5867cf57b319e8b378703de8112e0a4c5fd05aee108af7754fc3219eac54a673 not found: ID does not exist" Feb 23 14:25:59.830854 master-0 kubenswrapper[7728]: I0223 14:25:59.830824 7728 scope.go:117] "RemoveContainer" containerID="0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83" Feb 23 14:25:59.831129 master-0 kubenswrapper[7728]: E0223 14:25:59.831111 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83\": container with ID starting with 0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83 not found: ID does not exist" containerID="0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83" Feb 23 14:25:59.831217 master-0 kubenswrapper[7728]: I0223 14:25:59.831200 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83"} err="failed to get container status \"0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83\": rpc error: code = NotFound desc = could not find container \"0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83\": container with ID starting with 
0e43678d3197cf112cf0a044926bfa730d56557262cc8421afdcc26a5ee07b83 not found: ID does not exist" Feb 23 14:25:59.831281 master-0 kubenswrapper[7728]: I0223 14:25:59.831269 7728 scope.go:117] "RemoveContainer" containerID="4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818" Feb 23 14:25:59.832547 master-0 kubenswrapper[7728]: E0223 14:25:59.832525 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818\": container with ID starting with 4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818 not found: ID does not exist" containerID="4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818" Feb 23 14:25:59.832643 master-0 kubenswrapper[7728]: I0223 14:25:59.832626 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818"} err="failed to get container status \"4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818\": rpc error: code = NotFound desc = could not find container \"4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818\": container with ID starting with 4f3667b06f9040c2373de3a09349d52a663561d04056133aea74705119d3b818 not found: ID does not exist" Feb 23 14:25:59.832701 master-0 kubenswrapper[7728]: I0223 14:25:59.832691 7728 scope.go:117] "RemoveContainer" containerID="475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d" Feb 23 14:25:59.833050 master-0 kubenswrapper[7728]: E0223 14:25:59.833027 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d\": container with ID starting with 475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d not found: ID does not exist" 
containerID="475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d" Feb 23 14:25:59.833139 master-0 kubenswrapper[7728]: I0223 14:25:59.833123 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d"} err="failed to get container status \"475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d\": rpc error: code = NotFound desc = could not find container \"475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d\": container with ID starting with 475b682a5602a8b70516629df8770a92cda1f614d3b2e4b8f4d6b708bbc8532d not found: ID does not exist" Feb 23 14:25:59.833205 master-0 kubenswrapper[7728]: I0223 14:25:59.833194 7728 scope.go:117] "RemoveContainer" containerID="6fdaded4c1d5d4706ada0063d02a22ac0f3bed1016ec71609468c9f080c894da" Feb 23 14:25:59.874641 master-0 kubenswrapper[7728]: I0223 14:25:59.874504 7728 scope.go:117] "RemoveContainer" containerID="3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3" Feb 23 14:25:59.875517 master-0 kubenswrapper[7728]: I0223 14:25:59.875406 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3"} err="failed to get container status \"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3\": rpc error: code = NotFound desc = could not find container \"3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3\": container with ID starting with 3dfd224fb797b317bbd9fa5874481064b86c55aa823a6f87465b8ca08947f5d3 not found: ID does not exist" Feb 23 14:25:59.875517 master-0 kubenswrapper[7728]: I0223 14:25:59.875510 7728 scope.go:117] "RemoveContainer" containerID="b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69" Feb 23 14:25:59.876079 master-0 kubenswrapper[7728]: I0223 14:25:59.875978 7728 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69"} err="failed to get container status \"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69\": rpc error: code = NotFound desc = could not find container \"b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69\": container with ID starting with b8ab745e2116720c089d0aba55fcbbcd93f3d05db7dc85aaff6bdfb686118c69 not found: ID does not exist" Feb 23 14:25:59.876079 master-0 kubenswrapper[7728]: I0223 14:25:59.876048 7728 scope.go:117] "RemoveContainer" containerID="3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec" Feb 23 14:25:59.876614 master-0 kubenswrapper[7728]: I0223 14:25:59.876533 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec"} err="failed to get container status \"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec\": rpc error: code = NotFound desc = could not find container \"3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec\": container with ID starting with 3f92273d1230c6309ba0ff19f3495f90ece38f0e07bb4c16e151dee5c4fb41ec not found: ID does not exist" Feb 23 14:25:59.876614 master-0 kubenswrapper[7728]: I0223 14:25:59.876599 7728 scope.go:117] "RemoveContainer" containerID="59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9" Feb 23 14:25:59.877170 master-0 kubenswrapper[7728]: I0223 14:25:59.877106 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9"} err="failed to get container status \"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9\": rpc error: code = NotFound desc = could not find container \"59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9\": container with ID starting with 
59103074e5c9d28cc59a99d2933688907ecdae822b440f6d4da07709d19793c9 not found: ID does not exist" Feb 23 14:25:59.877275 master-0 kubenswrapper[7728]: I0223 14:25:59.877184 7728 scope.go:117] "RemoveContainer" containerID="bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430" Feb 23 14:25:59.877717 master-0 kubenswrapper[7728]: I0223 14:25:59.877668 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430"} err="failed to get container status \"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430\": rpc error: code = NotFound desc = could not find container \"bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430\": container with ID starting with bceb70263737a80d48b793aeeb1a38a769270ac03a734c22702eb093a9f1b430 not found: ID does not exist" Feb 23 14:25:59.877717 master-0 kubenswrapper[7728]: I0223 14:25:59.877710 7728 scope.go:117] "RemoveContainer" containerID="a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000" Feb 23 14:25:59.878167 master-0 kubenswrapper[7728]: I0223 14:25:59.878115 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000"} err="failed to get container status \"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000\": rpc error: code = NotFound desc = could not find container \"a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000\": container with ID starting with a0c28cc50bec94c9a70b8ff73f58f632e7f157d8192b386a307045a41a893000 not found: ID does not exist" Feb 23 14:25:59.878167 master-0 kubenswrapper[7728]: I0223 14:25:59.878160 7728 scope.go:117] "RemoveContainer" containerID="12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c" Feb 23 14:25:59.878670 master-0 kubenswrapper[7728]: I0223 14:25:59.878628 7728 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c"} err="failed to get container status \"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c\": rpc error: code = NotFound desc = could not find container \"12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c\": container with ID starting with 12ea317144b1f97c12db3d866b1cc7b66073f64b41a37f71cd1c51e60dce3e4c not found: ID does not exist" Feb 23 14:25:59.878715 master-0 kubenswrapper[7728]: I0223 14:25:59.878669 7728 scope.go:117] "RemoveContainer" containerID="b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46" Feb 23 14:25:59.879169 master-0 kubenswrapper[7728]: I0223 14:25:59.879121 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46"} err="failed to get container status \"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46\": rpc error: code = NotFound desc = could not find container \"b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46\": container with ID starting with b545413980bb822863005db697b932a984f3d1797f9e0fd0d4ca5331ec57bc46 not found: ID does not exist" Feb 23 14:26:00.037203 master-0 kubenswrapper[7728]: I0223 14:26:00.037008 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Feb 23 14:26:00.037203 master-0 kubenswrapper[7728]: I0223 14:26:00.037160 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Feb 23 14:26:00.109345 master-0 kubenswrapper[7728]: I0223 14:26:00.109289 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Feb 23 14:26:00.652028 master-0 kubenswrapper[7728]: I0223 14:26:00.651940 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:00.652028 master-0 kubenswrapper[7728]: I0223 14:26:00.652018 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:00.661890 master-0 kubenswrapper[7728]: I0223 14:26:00.661825 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:00.661890 master-0 kubenswrapper[7728]: I0223 14:26:00.661881 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:00.699285 master-0 kubenswrapper[7728]: I0223 14:26:00.699152 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/2.log" Feb 23 14:26:00.700702 master-0 kubenswrapper[7728]: I0223 
14:26:00.700658 7728 scope.go:117] "RemoveContainer" containerID="578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991" Feb 23 14:26:00.701075 master-0 kubenswrapper[7728]: E0223 14:26:00.701016 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-d6bb9bb76-4frj6_openshift-machine-api(12b256b7-a57b-4124-8452-25e74cfa7926)\"" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" podUID="12b256b7-a57b-4124-8452-25e74cfa7926" Feb 23 14:26:00.703108 master-0 kubenswrapper[7728]: I0223 14:26:00.703045 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/2.log" Feb 23 14:26:00.709186 master-0 kubenswrapper[7728]: I0223 14:26:00.709096 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/1.log" Feb 23 14:26:00.710718 master-0 kubenswrapper[7728]: I0223 14:26:00.710679 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:00.711431 master-0 kubenswrapper[7728]: E0223 14:26:00.711204 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:26:00.740029 master-0 
kubenswrapper[7728]: E0223 14:26:00.739753 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:25:50Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:25:50Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:25:50Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T14:25:50Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:584b5d125dad1fa4f8d03e6ace2e4901c173569ff1ed9536da6915c56fa52bc0\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8124eb3839b25af23303e9fdde35728bfd24d7c0c47530e77852cba1dd9d1ffb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1702755272},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:94d88fe2fa42931a725508dbf17296b6ed99b8e20c1169f5d1fb8a36f4927ddd\\\"],\\\"sizeBytes\\\":1637274270},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a8ac0ba2e5115c9d451d553741173ae8744d4544da15e28bf38f61630182fd\\\"],\\\"sizeBytes\\\":1237794314},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:c2af15d278f72034eecf3db74223b7e61f3d07c1a5c7ba760e7586915ff1b17e\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:e166d1252d7455d8bd62e43f2967e738ee9bdd6a09b7771a4187d82477ae7535\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1237042376},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5385b46d054c9ee73478bf23e07056d0b9f81d34619d0949927d8d9e791fcb5\\\",\\\"registry.redhat.io/redhat/c
ommunity-operator-index@sha256:ebdc10b149ba97b999770285d06149ef92c780205d916c3cab994098e20be0ba\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1210455233},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:518982b9ad8a8bfb7bb3b4216b235cac99e126df3bb48e390b36064560c76b83\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b3293b04e31c8e67c885f77e0ad2ee994295afde7c42cb9761c7090ae0cdb3f8\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1202767548},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4775c6461221dafe3ddd67ff683ccb665bed6eb278fa047d9d744aab9af65dcf\\\"],\\\"sizeBytes\\\":992461126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\\\"],\\\"sizeBytes\\\":943734757},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6c7ec917f0eff7b41d7174f1b5fdc4ce53ad106e51599afba731a8431ff9caa7\\\"],\\\"sizeBytes\\\":918153745},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ff40a2d97bf7a95e19303f7e972b7e8354a3864039111c6d33d5479117aaeed\\\"],\\\"sizeBytes\\\":880247193},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:72fafcd55ab739919dd8a114863fda27106af1c497f474e7ce0cb23b58dfa021\\\"],\\\"sizeBytes\\\":875998518},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7b9239f1f5e9590e3db71e61fde86db8f43e0085f61ae7769508d2ea058481c7\\\"],\\\"sizeBytes\\\":862501144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:572b0ca6e993beea2ee9346197665e56a2e4999fbb6958c747c48a35bf72ee34\\\"],\\\"sizeBytes\\\":862091954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3fa84eaa1310d97fe55bb23a7c27ece85718d0643fa7fc0ff81014edb4b948b\\\"],\\\"sizeBytes\\\":772838975},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:bd420e879c9f0271bca2d123a6d762591d9a4626b72f254d1f885842c32149e8\\\"],\\\"sizeBytes\\\":687849728},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3c467c1eeba7434b2aebf07169ab8afe0203d638e871dbdf29a16f830e9aef9e\\\"],\\\"sizeBytes\\\":682963466},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5121a0944000b7bfa57ae2e4eb3f412e1b4b89fcc75eec1ef20241182c0527f2\\\"],\\\"sizeBytes\\\":677827184},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5a31b448302fbb994548ed801ac488a44e8a7c4ae9149c3b4cc20d6af832f83\\\"],\\\"sizeBytes\\\":621542709},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e089c4e4fa9a22803b2673b776215e021a1f12a856dbcaba2fadee29bee10a3\\\"],\\\"sizeBytes\\\":589275174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1582ea693f35073e3316e2380a18227b78096ca7f4e1328f1dd8a2c423da26e9\\\"],\\\"sizeBytes\\\":582052489},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:314be88d356b2c8a3c4416daeb4cfcd58d617a4526319c01ddaffae4b4179e74\\\"],\\\"sizeBytes\\\":558105176},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:69f9df2f6b5cd83ab895e9e4a9bf8920d35fe450679ce06fb223944e95cfbe3e\\\"],\\\"sizeBytes\\\":557320737},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f86073cf0561e4b69668f8917ef5184cb0ef5aa16d0fefe38118f1167b268721\\\"],\\\"sizeBytes\\\":548646306},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d77a77c401bcfaa65a6ab6de82415af0e7ace1b470626647e5feb4875c89a5ef\\\"],\\\"sizeBytes\\\":529218694},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bc0ca626e5e17f9f78ddbfde54ea13ddc7749904911817bba16e6b59f30499ec\\\"],\\\"sizeBytes\\\":528829499},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:11f566fe2ae782ad96d36028b0fd81911a64ef787dcebc83803f741f272fa396\\\"]
,\\\"sizeBytes\\\":518279996},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:40bb7cf7c637bf9efd8fb0157839d325a019d67cc7d7279665fcf90dbb7f3f33\\\"],\\\"sizeBytes\\\":517888569},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\\\"],\\\"sizeBytes\\\":514875199},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a1b426a276216372c7d688fe60e9eaf251efd35071f94e1bcd4337f51a90fd75\\\"],\\\"sizeBytes\\\":513473308},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce471c00b59fd855a59f7efa9afdb3f0f9cbf1c4bcce3a82fe1a4cb82e90f52e\\\"],\\\"sizeBytes\\\":513119434},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a9dcbc6b966928b7597d4a822948ae6f07b62feecb91679c1d825d0d19426e19\\\"],\\\"sizeBytes\\\":512172666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5f4a546983224e416dfcc3a700afc15f9790182a5a2f8f7c94892d0e95abab3\\\"],\\\"sizeBytes\\\":511125422},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8de5c5b21ed8c7829ba988d580ffa470c9913877fe0ee5e11bf507400ffbc7\\\"],\\\"sizeBytes\\\":511059399},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:64ba461fd5594e3a30bfd755f1496707a88249bc68d07c65124c8617d664d2ac\\\"],\\\"sizeBytes\\\":508786786},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a82e441a9e9b93f0e010f1ce26e30c24b6ca93f7752084d4694ebdb3c5b53f83\\\"],\\\"sizeBytes\\\":508443359},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7bd3361d506dcc1be3afa62d35080c5dd37afccc26cd36019e2b9db2c45f896\\\"],\\\"sizeBytes\\\":507867630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:034588ffd95ce834e866279bf80a45af2cddda631c6c9a6344c1bb2e033fd83e\\\"],\\\"sizeBytes\\\":506374680},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c8618d42fe4da4881abe39e98691d187e13713981b66d0dac0a11cb1287482b7\\\"],\\\"sizeBytes\\\":506291135},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce68078d909b63bb5b872d94c04829aa1b5812c416abbaf9024840d348ee68b1\\\"],\\\"sizeBytes\\\":505244089},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:457c564075e8b14b1d24ff6eab750600ebc90ff8b7bb137306a579ee8445ae95\\\"],\\\"sizeBytes\\\":505137106},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebf883de8fd905490f0c9b420a5d6446ecde18e12e15364f6dcd4e885104972c\\\"],\\\"sizeBytes\\\":504558291},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:897708222502e4d710dd737923f74d153c084ba6048bffceb16dfd30f79a6ecc\\\"],\\\"sizeBytes\\\":504513960},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:86d9e1fdf97794f44fc1c91da025714ec6900fafa6cdc4c0041ffa95e9d70c6c\\\"],\\\"sizeBytes\\\":495888162},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4e8c6ae1f9a450c90857c9fbccf1e5fb404dbc0d65d086afce005d6bd307853b\\\"],\\\"sizeBytes\\\":494959854},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:117a846734fc8159b7172a40ed2feb43a969b7dbc113ee1a572cbf6f9f922655\\\"],\\\"sizeBytes\\\":486990304},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4797a485fd4ab3414ba8d52bdf2afccefab6c657b1d259baad703fca5145124c\\\"],\\\"sizeBytes\\\":484349508},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a132d09565133b36ac7c797213d6a74ac810bb368ef59136320ab3d300f45bd\\\"],\\\"sizeBytes\\\":484074784},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6acc7c3c018d8bb3cb597580eedae0300c44a5424f07129270c878899ef592a6\\\"],\\\"sizeBytes\\\":470717179},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:235b846666adaa2e4b4d6d0f7fd71d57bf3be253466e1d9fffafd103fa2696ac\\\"],\\\"sizeBytes\\\":4705758
02},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ce89154fa3fe1e87c660e644b58cf125fede575869fd5841600082c0d1f858a3\\\"],\\\"sizeBytes\\\":468159025}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:01.044968 master-0 kubenswrapper[7728]: E0223 14:26:01.044733 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Feb 23 14:26:01.044968 master-0 kubenswrapper[7728]: &Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e60017859e72 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:ProbeError,Message:Liveness probe error: Get "https://10.128.0.52:8443/healthz": dial tcp 10.128.0.52:8443: connect: connection refused Feb 23 14:26:01.044968 master-0 kubenswrapper[7728]: body: Feb 23 14:26:01.044968 master-0 kubenswrapper[7728]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:51.485890162 +0000 UTC m=+84.448551458,LastTimestamp:2026-02-23 14:19:51.485890162 +0000 UTC m=+84.448551458,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Feb 23 14:26:01.044968 master-0 kubenswrapper[7728]: > Feb 23 14:26:01.235532 master-0 kubenswrapper[7728]: I0223 14:26:01.235389 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" path="/var/lib/kubelet/pods/2a0bf4c4-8272-4f24-8e48-525d7a278b26/volumes" Feb 23 14:26:01.237399 master-0 kubenswrapper[7728]: I0223 14:26:01.237346 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ce2f82-05aa-4778-a444-848a408cf570" path="/var/lib/kubelet/pods/43ce2f82-05aa-4778-a444-848a408cf570/volumes" Feb 23 14:26:01.486574 master-0 kubenswrapper[7728]: I0223 14:26:01.486448 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:01.486836 master-0 kubenswrapper[7728]: I0223 14:26:01.486569 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:01.652943 master-0 kubenswrapper[7728]: I0223 14:26:01.652822 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Feb 23 14:26:01.652943 master-0 kubenswrapper[7728]: I0223 14:26:01.652902 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:01.709874 master-0 kubenswrapper[7728]: I0223 14:26:01.709775 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:01.709874 master-0 kubenswrapper[7728]: I0223 14:26:01.709866 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:01.715850 master-0 kubenswrapper[7728]: I0223 14:26:01.715786 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:01.716164 master-0 kubenswrapper[7728]: E0223 14:26:01.716110 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:26:02.653221 master-0 kubenswrapper[7728]: I0223 14:26:02.653096 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:02.653221 master-0 kubenswrapper[7728]: I0223 14:26:02.653200 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:02.715900 master-0 kubenswrapper[7728]: I0223 14:26:02.715758 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:02.716172 master-0 kubenswrapper[7728]: I0223 14:26:02.715911 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:02.738247 master-0 kubenswrapper[7728]: I0223 
14:26:02.738160 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:04.486638 master-0 kubenswrapper[7728]: I0223 14:26:04.486532 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:04.486638 master-0 kubenswrapper[7728]: I0223 14:26:04.486621 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:04.487910 master-0 kubenswrapper[7728]: I0223 14:26:04.486708 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:04.487910 master-0 kubenswrapper[7728]: I0223 14:26:04.486736 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:04.713702 master-0 kubenswrapper[7728]: I0223 14:26:04.713571 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:04.713702 master-0 kubenswrapper[7728]: I0223 14:26:04.713698 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:05.058818 master-0 kubenswrapper[7728]: I0223 14:26:05.058761 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Feb 23 14:26:05.910554 master-0 kubenswrapper[7728]: I0223 14:26:05.910512 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:26:05.912050 master-0 kubenswrapper[7728]: I0223 14:26:05.912023 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:05.912595 master-0 kubenswrapper[7728]: E0223 14:26:05.912560 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed 
container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:26:06.363774 master-0 kubenswrapper[7728]: I0223 14:26:06.363685 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:26:06.363774 master-0 kubenswrapper[7728]: I0223 14:26:06.363767 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:26:06.748641 master-0 kubenswrapper[7728]: I0223 14:26:06.748537 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:06.749010 master-0 kubenswrapper[7728]: E0223 14:26:06.748947 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:26:07.218962 master-0 kubenswrapper[7728]: I0223 14:26:07.218846 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:26:07.438100 master-0 kubenswrapper[7728]: I0223 14:26:07.438030 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:26:07.486201 master-0 kubenswrapper[7728]: I0223 14:26:07.486099 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness 
probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:07.486366 master-0 kubenswrapper[7728]: I0223 14:26:07.486198 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:07.486366 master-0 kubenswrapper[7728]: I0223 14:26:07.486257 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:26:07.486622 master-0 kubenswrapper[7728]: I0223 14:26:07.486422 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:07.486622 master-0 kubenswrapper[7728]: I0223 14:26:07.486557 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:07.487360 master-0 kubenswrapper[7728]: I0223 14:26:07.487299 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" 
containerStatusID={"Type":"cri-o","ID":"0bba5b88522722fd7c81731469c726d0abf8c593b96b63602cc16566d85db157"} pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Feb 23 14:26:07.487360 master-0 kubenswrapper[7728]: I0223 14:26:07.487350 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" containerID="cri-o://0bba5b88522722fd7c81731469c726d0abf8c593b96b63602cc16566d85db157" gracePeriod=30 Feb 23 14:26:07.501167 master-0 kubenswrapper[7728]: I0223 14:26:07.501042 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": read tcp 10.128.0.2:45738->10.128.0.52:8443: read: connection reset by peer" start-of-body= Feb 23 14:26:07.501516 master-0 kubenswrapper[7728]: I0223 14:26:07.501174 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": read tcp 10.128.0.2:45738->10.128.0.52:8443: read: connection reset by peer" Feb 23 14:26:07.755150 master-0 kubenswrapper[7728]: I0223 14:26:07.754816 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/3.log" Feb 23 14:26:07.756467 master-0 kubenswrapper[7728]: I0223 14:26:07.755919 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/2.log" Feb 23 14:26:07.756467 master-0 kubenswrapper[7728]: I0223 14:26:07.756389 7728 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="0bba5b88522722fd7c81731469c726d0abf8c593b96b63602cc16566d85db157" exitCode=255 Feb 23 14:26:07.756467 master-0 kubenswrapper[7728]: I0223 14:26:07.756436 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"0bba5b88522722fd7c81731469c726d0abf8c593b96b63602cc16566d85db157"} Feb 23 14:26:07.756467 master-0 kubenswrapper[7728]: I0223 14:26:07.756505 7728 scope.go:117] "RemoveContainer" containerID="0a3915eaedd169a17fc20783989eb20aa548b21f919f4de39f43389e2994de7c" Feb 23 14:26:07.757025 master-0 kubenswrapper[7728]: I0223 14:26:07.756981 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:07.757452 master-0 kubenswrapper[7728]: E0223 14:26:07.757236 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:26:08.768730 master-0 kubenswrapper[7728]: I0223 14:26:08.768641 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/3.log" Feb 23 14:26:08.770591 master-0 kubenswrapper[7728]: I0223 
14:26:08.770469 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b"} Feb 23 14:26:08.770892 master-0 kubenswrapper[7728]: I0223 14:26:08.770837 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:26:08.771257 master-0 kubenswrapper[7728]: I0223 14:26:08.771200 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:08.771698 master-0 kubenswrapper[7728]: E0223 14:26:08.771638 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:26:10.740381 master-0 kubenswrapper[7728]: E0223 14:26:10.740270 7728 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:11.642064 master-0 kubenswrapper[7728]: E0223 14:26:11.641970 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Feb 23 14:26:12.007999 master-0 kubenswrapper[7728]: I0223 14:26:12.007918 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Feb 23 14:26:12.485943 
master-0 kubenswrapper[7728]: I0223 14:26:12.485850 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:26:12.485943 master-0 kubenswrapper[7728]: I0223 14:26:12.485925 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:26:12.486272 master-0 kubenswrapper[7728]: I0223 14:26:12.486094 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:26:12.486272 master-0 kubenswrapper[7728]: I0223 14:26:12.486125 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:26:12.738058 master-0 kubenswrapper[7728]: I0223 14:26:12.737916 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:13.220561 master-0 kubenswrapper[7728]: I0223 14:26:13.220447 7728 scope.go:117] "RemoveContainer" containerID="578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991" Feb 23 14:26:13.809768 master-0 kubenswrapper[7728]: I0223 14:26:13.809618 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/2.log" Feb 23 14:26:13.810533 master-0 kubenswrapper[7728]: I0223 14:26:13.810369 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"123c9da1a73f2000f089739afcd687dfaec03bdf3090d2bb31eea1f983917dfc"} Feb 23 14:26:13.850207 master-0 kubenswrapper[7728]: I0223 14:26:13.850107 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=2.85008816 podStartE2EDuration="2.85008816s" podCreationTimestamp="2026-02-23 14:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:26:13.84618989 +0000 UTC m=+466.808851206" watchObservedRunningTime="2026-02-23 14:26:13.85008816 +0000 UTC m=+466.812749456" Feb 23 14:26:14.713873 master-0 kubenswrapper[7728]: I0223 14:26:14.713755 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:14.713873 master-0 kubenswrapper[7728]: I0223 14:26:14.713857 7728 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:15.486447 master-0 kubenswrapper[7728]: I0223 14:26:15.486314 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:26:15.486447 master-0 kubenswrapper[7728]: I0223 14:26:15.486316 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:26:15.486447 master-0 kubenswrapper[7728]: I0223 14:26:15.486418 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:26:15.487209 master-0 kubenswrapper[7728]: I0223 14:26:15.486518 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 
14:26:16.335587 master-0 kubenswrapper[7728]: E0223 14:26:16.335417 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Feb 23 14:26:17.844154 master-0 kubenswrapper[7728]: I0223 14:26:17.843922 7728 generic.go:334] "Generic (PLEG): container finished" podID="85365dec-af50-406c-b258-890e4f454c4a" containerID="348647c8be47f1f0398a726d98ab4e65fbf23ef3ceae1691e078bd87dddb99c7" exitCode=0 Feb 23 14:26:17.844154 master-0 kubenswrapper[7728]: I0223 14:26:17.844008 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerDied","Data":"348647c8be47f1f0398a726d98ab4e65fbf23ef3ceae1691e078bd87dddb99c7"} Feb 23 14:26:17.845160 master-0 kubenswrapper[7728]: I0223 14:26:17.845119 7728 scope.go:117] "RemoveContainer" containerID="348647c8be47f1f0398a726d98ab4e65fbf23ef3ceae1691e078bd87dddb99c7" Feb 23 14:26:18.486355 master-0 kubenswrapper[7728]: I0223 14:26:18.486263 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:26:18.486649 master-0 kubenswrapper[7728]: I0223 14:26:18.486356 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 
14:26:18.486649 master-0 kubenswrapper[7728]: I0223 14:26:18.486425 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:26:18.486786 master-0 kubenswrapper[7728]: I0223 14:26:18.486716 7728 patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:26:18.486886 master-0 kubenswrapper[7728]: I0223 14:26:18.486832 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:26:18.487442 master-0 kubenswrapper[7728]: I0223 14:26:18.487385 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b"} pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Feb 23 14:26:18.487566 master-0 kubenswrapper[7728]: I0223 14:26:18.487448 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" containerID="cri-o://5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b" gracePeriod=30 Feb 23 14:26:18.487642 master-0 kubenswrapper[7728]: I0223 14:26:18.487568 7728 
patch_prober.go:28] interesting pod/openshift-config-operator-6f47d587d6-55qjr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Feb 23 14:26:18.487780 master-0 kubenswrapper[7728]: I0223 14:26:18.487634 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Feb 23 14:26:18.853349 master-0 kubenswrapper[7728]: I0223 14:26:18.853210 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerStarted","Data":"87ac5de4a86de4ff5bcd686d0a9509d8482fb517bcd1929c37584398f44deed7"} Feb 23 14:26:19.424911 master-0 kubenswrapper[7728]: E0223 14:26:19.424815 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-6f47d587d6-55qjr_openshift-config-operator(92c63c95-e880-4f51-9858-7715343f7bd8)\"" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" Feb 23 14:26:19.862784 master-0 kubenswrapper[7728]: I0223 14:26:19.862667 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/4.log" Feb 23 14:26:19.863583 master-0 kubenswrapper[7728]: I0223 14:26:19.863220 7728 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/3.log" Feb 23 14:26:19.864151 master-0 kubenswrapper[7728]: I0223 14:26:19.864074 7728 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b" exitCode=255 Feb 23 14:26:19.864314 master-0 kubenswrapper[7728]: I0223 14:26:19.864143 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b"} Feb 23 14:26:19.864314 master-0 kubenswrapper[7728]: I0223 14:26:19.864206 7728 scope.go:117] "RemoveContainer" containerID="0bba5b88522722fd7c81731469c726d0abf8c593b96b63602cc16566d85db157" Feb 23 14:26:19.865143 master-0 kubenswrapper[7728]: I0223 14:26:19.865088 7728 scope.go:117] "RemoveContainer" containerID="5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b" Feb 23 14:26:19.865565 master-0 kubenswrapper[7728]: E0223 14:26:19.865519 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-6f47d587d6-55qjr_openshift-config-operator(92c63c95-e880-4f51-9858-7715343f7bd8)\"" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" Feb 23 14:26:20.221465 master-0 kubenswrapper[7728]: I0223 14:26:20.221346 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:20.221960 master-0 kubenswrapper[7728]: E0223 14:26:20.221888 7728 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:26:20.871795 master-0 kubenswrapper[7728]: I0223 14:26:20.871735 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/2.log" Feb 23 14:26:20.872656 master-0 kubenswrapper[7728]: I0223 14:26:20.872111 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/1.log" Feb 23 14:26:20.872656 master-0 kubenswrapper[7728]: I0223 14:26:20.872157 7728 generic.go:334] "Generic (PLEG): container finished" podID="865ceedb-b19a-4f2f-b295-311e1b7a645e" containerID="deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856" exitCode=255 Feb 23 14:26:20.872656 master-0 kubenswrapper[7728]: I0223 14:26:20.872221 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerDied","Data":"deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856"} Feb 23 14:26:20.872656 master-0 kubenswrapper[7728]: I0223 14:26:20.872259 7728 scope.go:117] "RemoveContainer" containerID="7afdcdc79bcf059c9a09c1210b1f2828b6a1174f97563a28f6b04cfb2b6ff9e4" Feb 23 14:26:20.872994 master-0 kubenswrapper[7728]: I0223 14:26:20.872792 7728 
scope.go:117] "RemoveContainer" containerID="deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856" Feb 23 14:26:20.873062 master-0 kubenswrapper[7728]: E0223 14:26:20.873004 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-fc889cfd5-tw2r9_openshift-kube-storage-version-migrator-operator(865ceedb-b19a-4f2f-b295-311e1b7a645e)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" podUID="865ceedb-b19a-4f2f-b295-311e1b7a645e" Feb 23 14:26:20.878874 master-0 kubenswrapper[7728]: I0223 14:26:20.878793 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/4.log" Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.881956 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/3.log" Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.882303 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/2.log" Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.882334 7728 generic.go:334] "Generic (PLEG): container finished" podID="cf04aca0-8174-4134-835d-37adf6a3b5ca" containerID="7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17" exitCode=255 Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.882396 7728 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerDied","Data":"7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17"} Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.883516 7728 scope.go:117] "RemoveContainer" containerID="7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17" Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: E0223 14:26:20.884007 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-7bcfbc574b-zdntd_openshift-kube-controller-manager-operator(cf04aca0-8174-4134-835d-37adf6a3b5ca)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" podUID="cf04aca0-8174-4134-835d-37adf6a3b5ca" Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.884114 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/2.log" Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.884734 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerDied","Data":"a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd"} Feb 23 14:26:20.885027 master-0 kubenswrapper[7728]: I0223 14:26:20.884651 7728 generic.go:334] "Generic (PLEG): container finished" podID="8de1f285-47ac-42aa-8026-8addce656362" containerID="a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd" exitCode=255 Feb 23 14:26:20.885511 master-0 kubenswrapper[7728]: I0223 14:26:20.885329 
7728 scope.go:117] "RemoveContainer" containerID="a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd" Feb 23 14:26:20.885561 master-0 kubenswrapper[7728]: E0223 14:26:20.885527 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=etcd-operator pod=etcd-operator-545bf96f4d-fpwtm_openshift-etcd-operator(8de1f285-47ac-42aa-8026-8addce656362)\"" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" podUID="8de1f285-47ac-42aa-8026-8addce656362" Feb 23 14:26:20.887081 master-0 kubenswrapper[7728]: I0223 14:26:20.887050 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/2.log" Feb 23 14:26:20.887459 master-0 kubenswrapper[7728]: I0223 14:26:20.887426 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/1.log" Feb 23 14:26:20.887528 master-0 kubenswrapper[7728]: I0223 14:26:20.887464 7728 generic.go:334] "Generic (PLEG): container finished" podID="b714a9df-026e-423d-a980-2569f0d92e47" containerID="22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877" exitCode=255 Feb 23 14:26:20.887562 master-0 kubenswrapper[7728]: I0223 14:26:20.887529 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerDied","Data":"22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877"} Feb 23 14:26:20.887837 master-0 kubenswrapper[7728]: I0223 14:26:20.887808 7728 scope.go:117] "RemoveContainer" containerID="22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877" Feb 23 14:26:20.888018 master-0 kubenswrapper[7728]: E0223 
14:26:20.887983 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=service-ca-operator pod=service-ca-operator-c48c8bf7c-vtnsw_openshift-service-ca-operator(b714a9df-026e-423d-a980-2569f0d92e47)\"" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" podUID="b714a9df-026e-423d-a980-2569f0d92e47" Feb 23 14:26:20.889753 master-0 kubenswrapper[7728]: I0223 14:26:20.889726 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/2.log" Feb 23 14:26:20.890448 master-0 kubenswrapper[7728]: I0223 14:26:20.890332 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/1.log" Feb 23 14:26:20.890448 master-0 kubenswrapper[7728]: I0223 14:26:20.890364 7728 generic.go:334] "Generic (PLEG): container finished" podID="961e4ecd-545b-4270-ae34-e733dec793b6" containerID="41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1" exitCode=255 Feb 23 14:26:20.890448 master-0 kubenswrapper[7728]: I0223 14:26:20.890389 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerDied","Data":"41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1"} Feb 23 14:26:20.890831 master-0 kubenswrapper[7728]: I0223 14:26:20.890727 7728 scope.go:117] "RemoveContainer" containerID="41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1" Feb 23 14:26:20.890917 master-0 kubenswrapper[7728]: E0223 14:26:20.890896 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5d87bf58c-nq2tz_openshift-kube-apiserver-operator(961e4ecd-545b-4270-ae34-e733dec793b6)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" podUID="961e4ecd-545b-4270-ae34-e733dec793b6" Feb 23 14:26:20.891578 master-0 kubenswrapper[7728]: I0223 14:26:20.891543 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/2.log" Feb 23 14:26:20.891937 master-0 kubenswrapper[7728]: I0223 14:26:20.891899 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/1.log" Feb 23 14:26:20.892016 master-0 kubenswrapper[7728]: I0223 14:26:20.891943 7728 generic.go:334] "Generic (PLEG): container finished" podID="674041a2-e2b0-4286-88cc-f1b00571e3f3" containerID="4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3" exitCode=255 Feb 23 14:26:20.892016 master-0 kubenswrapper[7728]: I0223 14:26:20.891996 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerDied","Data":"4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3"} Feb 23 14:26:20.892431 master-0 kubenswrapper[7728]: I0223 14:26:20.892383 7728 scope.go:117] "RemoveContainer" containerID="4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3" Feb 23 14:26:20.892594 master-0 kubenswrapper[7728]: E0223 14:26:20.892572 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=network-operator 
pod=network-operator-7d7db75979-x4qnw_openshift-network-operator(674041a2-e2b0-4286-88cc-f1b00571e3f3)\"" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" podUID="674041a2-e2b0-4286-88cc-f1b00571e3f3" Feb 23 14:26:20.894389 master-0 kubenswrapper[7728]: I0223 14:26:20.894337 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-5bd7768f54-bgg88_d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/cluster-olm-operator/1.log" Feb 23 14:26:20.895612 master-0 kubenswrapper[7728]: I0223 14:26:20.895586 7728 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="525b335554d223a0f792c02a10050ad9f40b958440d7f69f8c4c394f4e398780" exitCode=255 Feb 23 14:26:20.895730 master-0 kubenswrapper[7728]: I0223 14:26:20.895630 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerDied","Data":"525b335554d223a0f792c02a10050ad9f40b958440d7f69f8c4c394f4e398780"} Feb 23 14:26:20.895911 master-0 kubenswrapper[7728]: I0223 14:26:20.895857 7728 scope.go:117] "RemoveContainer" containerID="525b335554d223a0f792c02a10050ad9f40b958440d7f69f8c4c394f4e398780" Feb 23 14:26:20.896015 master-0 kubenswrapper[7728]: E0223 14:26:20.895983 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-5bd7768f54-bgg88_openshift-cluster-olm-operator(d2aa0d48-7c8e-4ddb-84a3-b3c34414c061)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" podUID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" Feb 23 14:26:20.897899 master-0 kubenswrapper[7728]: I0223 14:26:20.897846 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/2.log" Feb 23 14:26:20.898315 master-0 kubenswrapper[7728]: I0223 14:26:20.898289 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/1.log" Feb 23 14:26:20.898794 master-0 kubenswrapper[7728]: I0223 14:26:20.898327 7728 generic.go:334] "Generic (PLEG): container finished" podID="24829faf-50e8-45bb-abb0-7cc5ccf81080" containerID="50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc" exitCode=255 Feb 23 14:26:20.898794 master-0 kubenswrapper[7728]: I0223 14:26:20.898380 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerDied","Data":"50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc"} Feb 23 14:26:20.898794 master-0 kubenswrapper[7728]: I0223 14:26:20.898681 7728 scope.go:117] "RemoveContainer" containerID="50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc" Feb 23 14:26:20.899363 master-0 kubenswrapper[7728]: E0223 14:26:20.898834 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-8586dccc9b-tvnmq_openshift-apiserver-operator(24829faf-50e8-45bb-abb0-7cc5ccf81080)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" podUID="24829faf-50e8-45bb-abb0-7cc5ccf81080" Feb 23 14:26:20.900177 master-0 kubenswrapper[7728]: I0223 14:26:20.900150 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/2.log" Feb 23 14:26:20.900761 master-0 kubenswrapper[7728]: I0223 14:26:20.900733 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/1.log" Feb 23 14:26:20.900870 master-0 kubenswrapper[7728]: I0223 14:26:20.900775 7728 generic.go:334] "Generic (PLEG): container finished" podID="b9cf1c39-24f0-420b-8020-089616d1cdf0" containerID="cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753" exitCode=255 Feb 23 14:26:20.900870 master-0 kubenswrapper[7728]: I0223 14:26:20.900830 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerDied","Data":"cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753"} Feb 23 14:26:20.901322 master-0 kubenswrapper[7728]: I0223 14:26:20.901298 7728 scope.go:117] "RemoveContainer" containerID="cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753" Feb 23 14:26:20.901575 master-0 kubenswrapper[7728]: E0223 14:26:20.901546 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-77cd4d9559-qvq8x_openshift-kube-scheduler-operator(b9cf1c39-24f0-420b-8020-089616d1cdf0)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" podUID="b9cf1c39-24f0-420b-8020-089616d1cdf0" Feb 23 14:26:20.903416 master-0 kubenswrapper[7728]: I0223 14:26:20.903376 7728 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-584cc7bcb5-67ds6_cb6e88cd-98de-446a-92e8-f56a2f133703/openshift-controller-manager-operator/1.log" Feb 23 14:26:20.903844 master-0 kubenswrapper[7728]: I0223 14:26:20.903813 7728 generic.go:334] "Generic (PLEG): container finished" podID="cb6e88cd-98de-446a-92e8-f56a2f133703" containerID="e062ef1f26297d24d2516be8292a3297ef7a87cfa574a75bfb2f2e2e904d65e1" exitCode=255 Feb 23 14:26:20.903931 master-0 kubenswrapper[7728]: I0223 14:26:20.903844 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerDied","Data":"e062ef1f26297d24d2516be8292a3297ef7a87cfa574a75bfb2f2e2e904d65e1"} Feb 23 14:26:20.904191 master-0 kubenswrapper[7728]: I0223 14:26:20.904144 7728 scope.go:117] "RemoveContainer" containerID="e062ef1f26297d24d2516be8292a3297ef7a87cfa574a75bfb2f2e2e904d65e1" Feb 23 14:26:20.904384 master-0 kubenswrapper[7728]: E0223 14:26:20.904335 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-584cc7bcb5-67ds6_openshift-controller-manager-operator(cb6e88cd-98de-446a-92e8-f56a2f133703)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" podUID="cb6e88cd-98de-446a-92e8-f56a2f133703" Feb 23 14:26:20.921609 master-0 kubenswrapper[7728]: I0223 14:26:20.921499 7728 scope.go:117] "RemoveContainer" containerID="945eeda0e74497e88a62481b91d0c1abd43853d97eee9a925d78fc6fc7443101" Feb 23 14:26:20.975711 master-0 kubenswrapper[7728]: I0223 14:26:20.975674 7728 scope.go:117] "RemoveContainer" 
containerID="5f94c8fde6ae66d48d8282c5c57e237550057485795720cf3e4f35047fc2b408" Feb 23 14:26:21.000745 master-0 kubenswrapper[7728]: I0223 14:26:21.000683 7728 scope.go:117] "RemoveContainer" containerID="796f1fd46fb4b7de05e9d7265e3f4d090bdf2c82e271c42f786a885903a59f3d" Feb 23 14:26:21.025834 master-0 kubenswrapper[7728]: I0223 14:26:21.025777 7728 scope.go:117] "RemoveContainer" containerID="4077843a1666052f92a3616061104688e7ad630e49b861a884467e2e98bfca5d" Feb 23 14:26:21.151193 master-0 kubenswrapper[7728]: I0223 14:26:21.151108 7728 scope.go:117] "RemoveContainer" containerID="a2ae49d4722a1e40cd55dc37aa9260d992134f5c1ca873bbced79ae9e75c00b6" Feb 23 14:26:21.222742 master-0 kubenswrapper[7728]: I0223 14:26:21.222676 7728 scope.go:117] "RemoveContainer" containerID="bb60c962ed53b03fdfea9c76fcac5c126728571b797b7f917d784b1b7debd024" Feb 23 14:26:21.324716 master-0 kubenswrapper[7728]: I0223 14:26:21.324643 7728 scope.go:117] "RemoveContainer" containerID="81ff2e6b5bae83ef9904fdb97ede2ea9a1442b7adab5da48d4302eefab7a166a" Feb 23 14:26:21.380227 master-0 kubenswrapper[7728]: I0223 14:26:21.380166 7728 scope.go:117] "RemoveContainer" containerID="b4c3088f58a98599776247fbbf6c6a5a5751aeab22dce8d8142cec8b35a3fab9" Feb 23 14:26:21.463409 master-0 kubenswrapper[7728]: I0223 14:26:21.463348 7728 scope.go:117] "RemoveContainer" containerID="031c49419dbbce343a020e2a52b0b21aa31f7846ce6d6338d427aedeeb387c27" Feb 23 14:26:21.913723 master-0 kubenswrapper[7728]: I0223 14:26:21.913632 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/2.log" Feb 23 14:26:21.916811 master-0 kubenswrapper[7728]: I0223 14:26:21.916745 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-5bd7768f54-bgg88_d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/cluster-olm-operator/1.log" Feb 23 14:26:21.920416 
master-0 kubenswrapper[7728]: I0223 14:26:21.920338 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/2.log" Feb 23 14:26:21.923064 master-0 kubenswrapper[7728]: I0223 14:26:21.922999 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-584cc7bcb5-67ds6_cb6e88cd-98de-446a-92e8-f56a2f133703/openshift-controller-manager-operator/1.log" Feb 23 14:26:21.925596 master-0 kubenswrapper[7728]: I0223 14:26:21.925464 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/2.log" Feb 23 14:26:21.927782 master-0 kubenswrapper[7728]: I0223 14:26:21.927731 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/3.log" Feb 23 14:26:21.930208 master-0 kubenswrapper[7728]: I0223 14:26:21.930135 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/2.log" Feb 23 14:26:21.932544 master-0 kubenswrapper[7728]: I0223 14:26:21.932463 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/2.log" Feb 23 14:26:21.935001 master-0 kubenswrapper[7728]: I0223 14:26:21.934950 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/2.log" Feb 
23 14:26:21.937388 master-0 kubenswrapper[7728]: I0223 14:26:21.937330 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/2.log" Feb 23 14:26:22.248695 master-0 kubenswrapper[7728]: I0223 14:26:22.248577 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwtc6"] Feb 23 14:26:22.249024 master-0 kubenswrapper[7728]: I0223 14:26:22.248975 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vwtc6" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="registry-server" containerID="cri-o://e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095" gracePeriod=2 Feb 23 14:26:22.459127 master-0 kubenswrapper[7728]: I0223 14:26:22.459042 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtvwp"] Feb 23 14:26:22.459382 master-0 kubenswrapper[7728]: I0223 14:26:22.459336 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mtvwp" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="registry-server" containerID="cri-o://05b715f2ab02ad5806af23b9ce5ee7e81a28b5cc350517ef49d7396788fb5efc" gracePeriod=2 Feb 23 14:26:22.739192 master-0 kubenswrapper[7728]: I0223 14:26:22.739075 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:22.739429 master-0 kubenswrapper[7728]: I0223 14:26:22.739216 7728 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:26:22.740072 master-0 kubenswrapper[7728]: I0223 14:26:22.740026 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:22.740178 master-0 kubenswrapper[7728]: I0223 14:26:22.740133 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"db7391fbfbce6a5b2d3e6ea64af30eda73902a756b9cef12c2e6e67aee0522bb"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 23 14:26:22.740323 master-0 kubenswrapper[7728]: I0223 14:26:22.740202 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" containerID="cri-o://db7391fbfbce6a5b2d3e6ea64af30eda73902a756b9cef12c2e6e67aee0522bb" gracePeriod=30 Feb 23 14:26:22.756847 master-0 kubenswrapper[7728]: I0223 14:26:22.756783 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vwtc6" Feb 23 14:26:22.780213 master-0 kubenswrapper[7728]: I0223 14:26:22.780019 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-catalog-content\") pod \"b993917a-bce8-4467-a09d-bfc923a90460\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " Feb 23 14:26:22.780213 master-0 kubenswrapper[7728]: I0223 14:26:22.780063 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-utilities\") pod \"b993917a-bce8-4467-a09d-bfc923a90460\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " Feb 23 14:26:22.780213 master-0 kubenswrapper[7728]: I0223 14:26:22.780130 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96vz5\" (UniqueName: \"kubernetes.io/projected/b993917a-bce8-4467-a09d-bfc923a90460-kube-api-access-96vz5\") pod \"b993917a-bce8-4467-a09d-bfc923a90460\" (UID: \"b993917a-bce8-4467-a09d-bfc923a90460\") " Feb 23 14:26:22.781825 master-0 kubenswrapper[7728]: I0223 14:26:22.781765 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-utilities" (OuterVolumeSpecName: "utilities") pod "b993917a-bce8-4467-a09d-bfc923a90460" (UID: "b993917a-bce8-4467-a09d-bfc923a90460"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:26:22.784801 master-0 kubenswrapper[7728]: I0223 14:26:22.784717 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b993917a-bce8-4467-a09d-bfc923a90460-kube-api-access-96vz5" (OuterVolumeSpecName: "kube-api-access-96vz5") pod "b993917a-bce8-4467-a09d-bfc923a90460" (UID: "b993917a-bce8-4467-a09d-bfc923a90460"). 
InnerVolumeSpecName "kube-api-access-96vz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:26:22.881752 master-0 kubenswrapper[7728]: I0223 14:26:22.881688 7728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-utilities\") on node \"master-0\" DevicePath \"\"" Feb 23 14:26:22.881752 master-0 kubenswrapper[7728]: I0223 14:26:22.881722 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96vz5\" (UniqueName: \"kubernetes.io/projected/b993917a-bce8-4467-a09d-bfc923a90460-kube-api-access-96vz5\") on node \"master-0\" DevicePath \"\"" Feb 23 14:26:22.883401 master-0 kubenswrapper[7728]: I0223 14:26:22.883358 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b993917a-bce8-4467-a09d-bfc923a90460" (UID: "b993917a-bce8-4467-a09d-bfc923a90460"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:26:22.948662 master-0 kubenswrapper[7728]: I0223 14:26:22.948593 7728 generic.go:334] "Generic (PLEG): container finished" podID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerID="05b715f2ab02ad5806af23b9ce5ee7e81a28b5cc350517ef49d7396788fb5efc" exitCode=0
Feb 23 14:26:22.949118 master-0 kubenswrapper[7728]: I0223 14:26:22.948718 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtvwp" event={"ID":"3a398c0f-1b6a-4836-a8b4-33b004350d84","Type":"ContainerDied","Data":"05b715f2ab02ad5806af23b9ce5ee7e81a28b5cc350517ef49d7396788fb5efc"}
Feb 23 14:26:22.952928 master-0 kubenswrapper[7728]: I0223 14:26:22.952858 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="db7391fbfbce6a5b2d3e6ea64af30eda73902a756b9cef12c2e6e67aee0522bb" exitCode=255
Feb 23 14:26:22.953002 master-0 kubenswrapper[7728]: I0223 14:26:22.952944 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"db7391fbfbce6a5b2d3e6ea64af30eda73902a756b9cef12c2e6e67aee0522bb"}
Feb 23 14:26:22.953054 master-0 kubenswrapper[7728]: I0223 14:26:22.953003 7728 scope.go:117] "RemoveContainer" containerID="f38113657e6647d113d4b8b771a4b871cb4df714ffeae8172aebba272b7e4da9"
Feb 23 14:26:22.958517 master-0 kubenswrapper[7728]: I0223 14:26:22.958422 7728 generic.go:334] "Generic (PLEG): container finished" podID="b993917a-bce8-4467-a09d-bfc923a90460" containerID="e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095" exitCode=0
Feb 23 14:26:22.958517 master-0 kubenswrapper[7728]: I0223 14:26:22.958463 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerDied","Data":"e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095"}
Feb 23 14:26:22.958662 master-0 kubenswrapper[7728]: I0223 14:26:22.958547 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwtc6"
Feb 23 14:26:22.958749 master-0 kubenswrapper[7728]: I0223 14:26:22.958559 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwtc6" event={"ID":"b993917a-bce8-4467-a09d-bfc923a90460","Type":"ContainerDied","Data":"72afd593f65efb98e2b6fe3298c11d36a194a008b8e1244f1c39b4b489e885a8"}
Feb 23 14:26:22.983122 master-0 kubenswrapper[7728]: I0223 14:26:22.983067 7728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b993917a-bce8-4467-a09d-bfc923a90460-catalog-content\") on node \"master-0\" DevicePath \"\""
Feb 23 14:26:23.021917 master-0 kubenswrapper[7728]: I0223 14:26:23.021835 7728 scope.go:117] "RemoveContainer" containerID="e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095"
Feb 23 14:26:23.050944 master-0 kubenswrapper[7728]: I0223 14:26:23.050798 7728 scope.go:117] "RemoveContainer" containerID="d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0"
Feb 23 14:26:23.067557 master-0 kubenswrapper[7728]: I0223 14:26:23.067246 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vwtc6"]
Feb 23 14:26:23.077046 master-0 kubenswrapper[7728]: I0223 14:26:23.076987 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vwtc6"]
Feb 23 14:26:23.083343 master-0 kubenswrapper[7728]: I0223 14:26:23.083303 7728 scope.go:117] "RemoveContainer" containerID="265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014"
Feb 23 14:26:23.113384 master-0 kubenswrapper[7728]: I0223 14:26:23.113172 7728 scope.go:117] "RemoveContainer" containerID="e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095"
Feb 23 14:26:23.114117 master-0 kubenswrapper[7728]: E0223 14:26:23.114080 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095\": container with ID starting with e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095 not found: ID does not exist" containerID="e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095"
Feb 23 14:26:23.114184 master-0 kubenswrapper[7728]: I0223 14:26:23.114122 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095"} err="failed to get container status \"e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095\": rpc error: code = NotFound desc = could not find container \"e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095\": container with ID starting with e78de7953f2a8533477d2a1bfe68954a05a59fc83000f53a6813ad0fc2dd2095 not found: ID does not exist"
Feb 23 14:26:23.114184 master-0 kubenswrapper[7728]: I0223 14:26:23.114159 7728 scope.go:117] "RemoveContainer" containerID="d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0"
Feb 23 14:26:23.115297 master-0 kubenswrapper[7728]: E0223 14:26:23.114523 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0\": container with ID starting with d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0 not found: ID does not exist" containerID="d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0"
Feb 23 14:26:23.115297 master-0 kubenswrapper[7728]: I0223 14:26:23.114573 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0"} err="failed to get container status \"d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0\": rpc error: code = NotFound desc = could not find container \"d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0\": container with ID starting with d394e98a69d42d1c4f58982ec3fcda0ef2490c48a0e2aec3d0cd3ed1011f69a0 not found: ID does not exist"
Feb 23 14:26:23.115297 master-0 kubenswrapper[7728]: I0223 14:26:23.114601 7728 scope.go:117] "RemoveContainer" containerID="265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014"
Feb 23 14:26:23.115297 master-0 kubenswrapper[7728]: I0223 14:26:23.114820 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtvwp"
Feb 23 14:26:23.115297 master-0 kubenswrapper[7728]: E0223 14:26:23.114899 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014\": container with ID starting with 265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014 not found: ID does not exist" containerID="265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014"
Feb 23 14:26:23.115297 master-0 kubenswrapper[7728]: I0223 14:26:23.114930 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014"} err="failed to get container status \"265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014\": rpc error: code = NotFound desc = could not find container \"265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014\": container with ID starting with 265114de1092744105855e531fe1a322b26c21db647549095788228771d6e014 not found: ID does not exist"
Feb 23 14:26:23.191115 master-0 kubenswrapper[7728]: I0223 14:26:23.190961 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqvf6\" (UniqueName: \"kubernetes.io/projected/3a398c0f-1b6a-4836-a8b4-33b004350d84-kube-api-access-rqvf6\") pod \"3a398c0f-1b6a-4836-a8b4-33b004350d84\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") "
Feb 23 14:26:23.191115 master-0 kubenswrapper[7728]: I0223 14:26:23.191020 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-utilities\") pod \"3a398c0f-1b6a-4836-a8b4-33b004350d84\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") "
Feb 23 14:26:23.191115 master-0 kubenswrapper[7728]: I0223 14:26:23.191080 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-catalog-content\") pod \"3a398c0f-1b6a-4836-a8b4-33b004350d84\" (UID: \"3a398c0f-1b6a-4836-a8b4-33b004350d84\") "
Feb 23 14:26:23.192311 master-0 kubenswrapper[7728]: I0223 14:26:23.192265 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-utilities" (OuterVolumeSpecName: "utilities") pod "3a398c0f-1b6a-4836-a8b4-33b004350d84" (UID: "3a398c0f-1b6a-4836-a8b4-33b004350d84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:26:23.194166 master-0 kubenswrapper[7728]: I0223 14:26:23.194130 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a398c0f-1b6a-4836-a8b4-33b004350d84-kube-api-access-rqvf6" (OuterVolumeSpecName: "kube-api-access-rqvf6") pod "3a398c0f-1b6a-4836-a8b4-33b004350d84" (UID: "3a398c0f-1b6a-4836-a8b4-33b004350d84"). InnerVolumeSpecName "kube-api-access-rqvf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:26:23.230275 master-0 kubenswrapper[7728]: I0223 14:26:23.230245 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b993917a-bce8-4467-a09d-bfc923a90460" path="/var/lib/kubelet/pods/b993917a-bce8-4467-a09d-bfc923a90460/volumes"
Feb 23 14:26:23.292401 master-0 kubenswrapper[7728]: I0223 14:26:23.292294 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqvf6\" (UniqueName: \"kubernetes.io/projected/3a398c0f-1b6a-4836-a8b4-33b004350d84-kube-api-access-rqvf6\") on node \"master-0\" DevicePath \"\""
Feb 23 14:26:23.292401 master-0 kubenswrapper[7728]: I0223 14:26:23.292370 7728 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-utilities\") on node \"master-0\" DevicePath \"\""
Feb 23 14:26:23.327224 master-0 kubenswrapper[7728]: I0223 14:26:23.327106 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a398c0f-1b6a-4836-a8b4-33b004350d84" (UID: "3a398c0f-1b6a-4836-a8b4-33b004350d84"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:26:23.393751 master-0 kubenswrapper[7728]: I0223 14:26:23.393684 7728 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a398c0f-1b6a-4836-a8b4-33b004350d84-catalog-content\") on node \"master-0\" DevicePath \"\""
Feb 23 14:26:23.526541 master-0 kubenswrapper[7728]: E0223 14:26:23.526370 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb"
Feb 23 14:26:23.968190 master-0 kubenswrapper[7728]: I0223 14:26:23.968137 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtvwp" event={"ID":"3a398c0f-1b6a-4836-a8b4-33b004350d84","Type":"ContainerDied","Data":"ccfc6bc0e26aee4698f58d8e9759b6618ec00fc42f551f9f7741ee4b369d51bb"}
Feb 23 14:26:23.969052 master-0 kubenswrapper[7728]: I0223 14:26:23.968206 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtvwp"
Feb 23 14:26:23.969052 master-0 kubenswrapper[7728]: I0223 14:26:23.968973 7728 scope.go:117] "RemoveContainer" containerID="05b715f2ab02ad5806af23b9ce5ee7e81a28b5cc350517ef49d7396788fb5efc"
Feb 23 14:26:23.973617 master-0 kubenswrapper[7728]: I0223 14:26:23.973542 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"16c3c1cec998b569c900e35bdf09e7492aae0331e38f36582875c9a5db092d13"}
Feb 23 14:26:23.974288 master-0 kubenswrapper[7728]: I0223 14:26:23.974218 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"
Feb 23 14:26:23.974694 master-0 kubenswrapper[7728]: E0223 14:26:23.974634 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb"
Feb 23 14:26:23.991505 master-0 kubenswrapper[7728]: I0223 14:26:23.991451 7728 scope.go:117] "RemoveContainer" containerID="99efcfca15e0062c8ae1454f293dca23babf93693b0446a429661734820c3937"
Feb 23 14:26:24.031255 master-0 kubenswrapper[7728]: I0223 14:26:24.030880 7728 scope.go:117] "RemoveContainer" containerID="b36b4de11e46511d30e9c21f6168fa9e64f6cbbd9e9b705ec3f96b48214779bc"
Feb 23 14:26:24.105825 master-0 kubenswrapper[7728]: I0223 14:26:24.105750 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtvwp"]
Feb 23 14:26:24.114222 master-0 kubenswrapper[7728]: I0223 14:26:24.114149 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mtvwp"]
Feb 23 14:26:24.412934 master-0 kubenswrapper[7728]: I0223 14:26:24.412588 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:26:24.413630 master-0 kubenswrapper[7728]: I0223 14:26:24.413574 7728 scope.go:117] "RemoveContainer" containerID="a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd"
Feb 23 14:26:24.413968 master-0 kubenswrapper[7728]: E0223 14:26:24.413905 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=etcd-operator pod=etcd-operator-545bf96f4d-fpwtm_openshift-etcd-operator(8de1f285-47ac-42aa-8026-8addce656362)\"" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" podUID="8de1f285-47ac-42aa-8026-8addce656362"
Feb 23 14:26:24.712738 master-0 kubenswrapper[7728]: I0223 14:26:24.712675 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 23 14:26:24.712979 master-0 kubenswrapper[7728]: I0223 14:26:24.712764 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 23 14:26:24.984719 master-0 kubenswrapper[7728]: I0223 14:26:24.984533 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"
Feb 23 14:26:24.985444 master-0 kubenswrapper[7728]: E0223 14:26:24.984794 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb"
Feb 23 14:26:25.232599 master-0 kubenswrapper[7728]: I0223 14:26:25.232459 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" path="/var/lib/kubelet/pods/3a398c0f-1b6a-4836-a8b4-33b004350d84/volumes"
Feb 23 14:26:25.909768 master-0 kubenswrapper[7728]: I0223 14:26:25.909659 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:26:25.990310 master-0 kubenswrapper[7728]: I0223 14:26:25.990193 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"
Feb 23 14:26:25.991228 master-0 kubenswrapper[7728]: E0223 14:26:25.990698 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb"
Feb 23 14:26:29.011168 master-0 kubenswrapper[7728]: I0223 14:26:29.011064 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/3.log"
Feb 23 14:26:29.012145 master-0 kubenswrapper[7728]: I0223 14:26:29.011757 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/2.log"
Feb 23 14:26:29.012145 master-0 kubenswrapper[7728]: I0223 14:26:29.011826 7728 generic.go:334] "Generic (PLEG): container finished" podID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17" containerID="3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635" exitCode=1
Feb 23 14:26:29.012145 master-0 kubenswrapper[7728]: I0223 14:26:29.011873 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerDied","Data":"3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635"}
Feb 23 14:26:29.012145 master-0 kubenswrapper[7728]: I0223 14:26:29.011924 7728 scope.go:117] "RemoveContainer" containerID="4d73e7e4ca95353fc1daf5c78e6fb2d258da6e5bbcc4add88a2d98722b1263c5"
Feb 23 14:26:29.012662 master-0 kubenswrapper[7728]: I0223 14:26:29.012595 7728 scope.go:117] "RemoveContainer" containerID="3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635"
Feb 23 14:26:29.012986 master-0 kubenswrapper[7728]: E0223 14:26:29.012928 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6847bb4785-5fw2x_openshift-cluster-storage-operator(2e89a047-9ebc-459b-b7b3-e902c1fb0e17)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" podUID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17"
Feb 23 14:26:29.738381 master-0 kubenswrapper[7728]: I0223 14:26:29.738242 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:26:29.739054 master-0 kubenswrapper[7728]: I0223 14:26:29.739018 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"
Feb 23 14:26:29.739550 master-0 kubenswrapper[7728]: E0223 14:26:29.739472 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb"
Feb 23 14:26:30.019928 master-0 kubenswrapper[7728]: I0223 14:26:30.019773 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/3.log"
Feb 23 14:26:30.021359 master-0 kubenswrapper[7728]: I0223 14:26:30.021330 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8bb99f4f-msq8f_482284fd-6911-4ba6-8d57-7966cc51117a/route-controller-manager/1.log"
Feb 23 14:26:30.021818 master-0 kubenswrapper[7728]: I0223 14:26:30.021784 7728 generic.go:334] "Generic (PLEG): container finished" podID="482284fd-6911-4ba6-8d57-7966cc51117a" containerID="cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7" exitCode=255
Feb 23 14:26:30.021868 master-0 kubenswrapper[7728]: I0223 14:26:30.021830 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerDied","Data":"cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7"}
Feb 23 14:26:30.021966 master-0 kubenswrapper[7728]: I0223 14:26:30.021922 7728 scope.go:117] "RemoveContainer" containerID="30dd1f19a8b444dbc9b769a06f0917819d1c1e9174b5fb3b5552595a9eed345f"
Feb 23 14:26:30.022913 master-0 kubenswrapper[7728]: I0223 14:26:30.022850 7728 scope.go:117] "RemoveContainer" containerID="cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7"
Feb 23 14:26:30.023555 master-0 kubenswrapper[7728]: E0223 14:26:30.023262 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"route-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=route-controller-manager pod=route-controller-manager-8bb99f4f-msq8f_openshift-route-controller-manager(482284fd-6911-4ba6-8d57-7966cc51117a)\"" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a"
Feb 23 14:26:30.270009 master-0 kubenswrapper[7728]: I0223 14:26:30.269883 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:26:31.030294 master-0 kubenswrapper[7728]: I0223 14:26:31.030206 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8bb99f4f-msq8f_482284fd-6911-4ba6-8d57-7966cc51117a/route-controller-manager/1.log"
Feb 23 14:26:31.221213 master-0 kubenswrapper[7728]: I0223 14:26:31.221143 7728 scope.go:117] "RemoveContainer" containerID="cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753"
Feb 23 14:26:31.221588 master-0 kubenswrapper[7728]: I0223 14:26:31.221545 7728 scope.go:117] "RemoveContainer" containerID="41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1"
Feb 23 14:26:31.221922 master-0 kubenswrapper[7728]: E0223 14:26:31.221883 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5d87bf58c-nq2tz_openshift-kube-apiserver-operator(961e4ecd-545b-4270-ae34-e733dec793b6)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" podUID="961e4ecd-545b-4270-ae34-e733dec793b6"
Feb 23 14:26:31.221961 master-0 kubenswrapper[7728]: E0223 14:26:31.221549 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-77cd4d9559-qvq8x_openshift-kube-scheduler-operator(b9cf1c39-24f0-420b-8020-089616d1cdf0)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" podUID="b9cf1c39-24f0-420b-8020-089616d1cdf0"
Feb 23 14:26:32.220168 master-0 kubenswrapper[7728]: I0223 14:26:32.220095 7728 scope.go:117] "RemoveContainer" containerID="4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3"
Feb 23 14:26:32.223213 master-0 kubenswrapper[7728]: E0223 14:26:32.220388 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=network-operator pod=network-operator-7d7db75979-x4qnw_openshift-network-operator(674041a2-e2b0-4286-88cc-f1b00571e3f3)\"" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" podUID="674041a2-e2b0-4286-88cc-f1b00571e3f3"
Feb 23 14:26:32.738825 master-0 kubenswrapper[7728]: I0223 14:26:32.738702 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 14:26:33.221229 master-0 kubenswrapper[7728]: I0223 14:26:33.221155 7728 scope.go:117] "RemoveContainer" containerID="22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877"
Feb 23 14:26:33.222117 master-0 kubenswrapper[7728]: I0223 14:26:33.221227 7728 scope.go:117] "RemoveContainer" containerID="e062ef1f26297d24d2516be8292a3297ef7a87cfa574a75bfb2f2e2e904d65e1"
Feb 23 14:26:33.222117 master-0 kubenswrapper[7728]: E0223 14:26:33.221381 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=service-ca-operator pod=service-ca-operator-c48c8bf7c-vtnsw_openshift-service-ca-operator(b714a9df-026e-423d-a980-2569f0d92e47)\"" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" podUID="b714a9df-026e-423d-a980-2569f0d92e47"
Feb 23 14:26:33.336586 master-0 kubenswrapper[7728]: E0223 14:26:33.336453 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Feb 23 14:26:33.712796 master-0 kubenswrapper[7728]: I0223 14:26:33.712691 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:26:33.713415 master-0 kubenswrapper[7728]: I0223 14:26:33.713335 7728 scope.go:117] "RemoveContainer" containerID="cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7"
Feb 23 14:26:33.713635 master-0 kubenswrapper[7728]: E0223 14:26:33.713608 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"route-controller-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=route-controller-manager pod=route-controller-manager-8bb99f4f-msq8f_openshift-route-controller-manager(482284fd-6911-4ba6-8d57-7966cc51117a)\"" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a"
Feb 23 14:26:34.050698 master-0 kubenswrapper[7728]: I0223 14:26:34.050555 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-584cc7bcb5-67ds6_cb6e88cd-98de-446a-92e8-f56a2f133703/openshift-controller-manager-operator/1.log"
Feb 23 14:26:34.050698 master-0 kubenswrapper[7728]: I0223 14:26:34.050632 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerStarted","Data":"c12a63d20fd94ddf67f99b5864ba6402ee26036260625afea30b344472b19c16"}
Feb 23 14:26:34.221971 master-0 kubenswrapper[7728]: I0223 14:26:34.221873 7728 scope.go:117] "RemoveContainer" containerID="50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc"
Feb 23 14:26:34.222549 master-0 kubenswrapper[7728]: I0223 14:26:34.222057 7728 scope.go:117] "RemoveContainer" containerID="7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17"
Feb 23 14:26:34.222549 master-0 kubenswrapper[7728]: E0223 14:26:34.222342 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-8586dccc9b-tvnmq_openshift-apiserver-operator(24829faf-50e8-45bb-abb0-7cc5ccf81080)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" podUID="24829faf-50e8-45bb-abb0-7cc5ccf81080"
Feb 23 14:26:34.222549 master-0 kubenswrapper[7728]: E0223 14:26:34.222396 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-7bcfbc574b-zdntd_openshift-kube-controller-manager-operator(cf04aca0-8174-4134-835d-37adf6a3b5ca)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" podUID="cf04aca0-8174-4134-835d-37adf6a3b5ca"
Feb 23 14:26:35.047903 master-0 kubenswrapper[7728]: E0223 14:26:35.047738 7728 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-config-operator-6f47d587d6-55qjr.1896e60016c58bd2 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-6f47d587d6-55qjr,UID:92c63c95-e880-4f51-9858-7715343f7bd8,APIVersion:v1,ResourceVersion:8517,FieldPath:spec.containers{openshift-config-operator},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:19:51.473302482 +0000 UTC m=+84.435963778,LastTimestamp:2026-02-23 14:19:51.485893142 +0000 UTC m=+84.448554428,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Feb 23 14:26:35.199472 master-0 kubenswrapper[7728]: I0223 14:26:35.199353 7728 patch_prober.go:28] interesting pod/authentication-operator-5bd7c86784-mlbx2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body=
Feb 23 14:26:35.199472 master-0 kubenswrapper[7728]: I0223 14:26:35.199443 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" podUID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused"
Feb 23 14:26:35.220958 master-0 kubenswrapper[7728]: I0223 14:26:35.220872 7728 scope.go:117] "RemoveContainer" containerID="5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b"
Feb 23 14:26:35.221145 master-0 kubenswrapper[7728]: I0223 14:26:35.221023 7728 scope.go:117] "RemoveContainer" containerID="525b335554d223a0f792c02a10050ad9f40b958440d7f69f8c4c394f4e398780"
Feb 23 14:26:35.221228 master-0 kubenswrapper[7728]: E0223 14:26:35.221174 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-6f47d587d6-55qjr_openshift-config-operator(92c63c95-e880-4f51-9858-7715343f7bd8)\"" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8"
Feb 23 14:26:36.069377 master-0 kubenswrapper[7728]: I0223 14:26:36.069292 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-5bd7768f54-bgg88_d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/cluster-olm-operator/1.log"
Feb 23 14:26:36.070435 master-0 kubenswrapper[7728]: I0223 14:26:36.070377 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerStarted","Data":"1e39418594b660aa7d2fd220cbe33ba64607470ce93c748d1b847909be0ae5de"}
Feb 23 14:26:36.221115 master-0 kubenswrapper[7728]: I0223 14:26:36.220988 7728 scope.go:117] "RemoveContainer" containerID="a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd"
Feb 23 14:26:36.221115 master-0 kubenswrapper[7728]: I0223 14:26:36.221093 7728 scope.go:117] "RemoveContainer" containerID="deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856"
Feb 23 14:26:36.221719 master-0 kubenswrapper[7728]: E0223 14:26:36.221467 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-fc889cfd5-tw2r9_openshift-kube-storage-version-migrator-operator(865ceedb-b19a-4f2f-b295-311e1b7a645e)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" podUID="865ceedb-b19a-4f2f-b295-311e1b7a645e"
Feb 23 14:26:36.221719 master-0 kubenswrapper[7728]: E0223 14:26:36.221528 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=etcd-operator pod=etcd-operator-545bf96f4d-fpwtm_openshift-etcd-operator(8de1f285-47ac-42aa-8026-8addce656362)\"" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" podUID="8de1f285-47ac-42aa-8026-8addce656362"
Feb 23 14:26:42.221461 master-0 kubenswrapper[7728]: I0223 14:26:42.221355 7728 scope.go:117] "RemoveContainer" containerID="cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753"
Feb 23 14:26:42.739437 master-0 kubenswrapper[7728]: I0223 14:26:42.739219 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 23 14:26:43.220757 master-0 kubenswrapper[7728]: I0223 14:26:43.220701 7728 scope.go:117] "RemoveContainer" containerID="3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635"
Feb 23 14:26:43.221069 master-0 kubenswrapper[7728]: E0223 14:26:43.221026 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-6847bb4785-5fw2x_openshift-cluster-storage-operator(2e89a047-9ebc-459b-b7b3-e902c1fb0e17)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" podUID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17"
Feb 23 14:26:44.136021 master-0 kubenswrapper[7728]: I0223 14:26:44.135931 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/2.log"
Feb 23 14:26:44.136635 master-0 kubenswrapper[7728]: I0223 14:26:44.136034 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerStarted","Data":"de147076f9d00421d454d2b4d2c157f8440c0694fa5a6d3ad620d5c550d4adbe"}
Feb 23 14:26:44.221317 master-0 kubenswrapper[7728]: I0223 14:26:44.221107 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6"
Feb 23 14:26:44.221545 master-0 kubenswrapper[7728]: E0223 14:26:44.221391 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb"
Feb 23 14:26:44.221665 master-0 kubenswrapper[7728]: I0223 14:26:44.221614 7728 scope.go:117] "RemoveContainer" containerID="4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3"
Feb 23 14:26:44.221787 master-0 kubenswrapper[7728]: I0223 14:26:44.221760 7728 scope.go:117] "RemoveContainer" containerID="cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7"
Feb 23 14:26:45.144997 master-0 kubenswrapper[7728]: I0223 14:26:45.144937 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/2.log"
Feb 23 14:26:45.146013 master-0 kubenswrapper[7728]: I0223 14:26:45.145059 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerStarted","Data":"1b2ef32436a2cab762069d2b98a667383566db0becb9c8c8fa356e686ede57e6"}
Feb 23 14:26:45.148668 master-0 kubenswrapper[7728]: I0223 14:26:45.148616 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8bb99f4f-msq8f_482284fd-6911-4ba6-8d57-7966cc51117a/route-controller-manager/1.log"
Feb 23 14:26:45.148847 master-0 kubenswrapper[7728]: I0223 14:26:45.148690 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerStarted","Data":"f3016d799a76a0e861077c16a305235311ad634f8054a70ca605ccf2e9c27c2c"}
Feb 23 14:26:45.149556 master-0 kubenswrapper[7728]: I0223 14:26:45.149368 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:26:45.197582 master-0 kubenswrapper[7728]: I0223 14:26:45.197508 7728 patch_prober.go:28] interesting pod/authentication-operator-5bd7c86784-mlbx2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" start-of-body= Feb 23 14:26:45.197807 master-0 kubenswrapper[7728]: I0223 14:26:45.197610 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" podUID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": dial tcp 10.128.0.10:8443: connect: connection refused" Feb 23 14:26:46.149827 master-0 kubenswrapper[7728]: I0223 14:26:46.149674 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:46.149827 master-0 kubenswrapper[7728]: I0223 14:26:46.149772 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:46.220638 master-0 kubenswrapper[7728]: I0223 14:26:46.220546 7728 scope.go:117] "RemoveContainer" 
containerID="41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1" Feb 23 14:26:47.155883 master-0 kubenswrapper[7728]: I0223 14:26:47.155802 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:47.155883 master-0 kubenswrapper[7728]: I0223 14:26:47.155871 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:47.166151 master-0 kubenswrapper[7728]: I0223 14:26:47.166088 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/2.log" Feb 23 14:26:47.166324 master-0 kubenswrapper[7728]: I0223 14:26:47.166187 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerStarted","Data":"fbac2782160d294364c1ee2f8ffe6113c931b2c5daaaa5bdf6584ea5ed99fbf7"} Feb 23 14:26:47.225651 master-0 kubenswrapper[7728]: I0223 14:26:47.225564 7728 scope.go:117] "RemoveContainer" containerID="50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc" Feb 23 14:26:48.176064 master-0 kubenswrapper[7728]: I0223 14:26:48.175991 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/2.log" Feb 23 14:26:48.176064 master-0 kubenswrapper[7728]: I0223 14:26:48.176049 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerStarted","Data":"0772c3729738659c84accce490c8a697b2dc528f52ebd605f9f2d93ea9004a9e"} Feb 23 14:26:48.220691 master-0 kubenswrapper[7728]: I0223 14:26:48.220633 7728 scope.go:117] "RemoveContainer" containerID="22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877" Feb 23 14:26:48.221042 master-0 kubenswrapper[7728]: I0223 14:26:48.220989 7728 scope.go:117] "RemoveContainer" containerID="7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17" Feb 23 14:26:48.221474 master-0 kubenswrapper[7728]: E0223 14:26:48.221419 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-7bcfbc574b-zdntd_openshift-kube-controller-manager-operator(cf04aca0-8174-4134-835d-37adf6a3b5ca)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" podUID="cf04aca0-8174-4134-835d-37adf6a3b5ca" Feb 23 14:26:49.186831 master-0 kubenswrapper[7728]: I0223 14:26:49.186768 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/2.log" Feb 23 14:26:49.187762 master-0 kubenswrapper[7728]: I0223 14:26:49.186856 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" 
event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerStarted","Data":"78089bf3eecbd92606ced5d402993a51e0b2a1c97d9b614aa158b112de869851"} Feb 23 14:26:49.220993 master-0 kubenswrapper[7728]: I0223 14:26:49.220902 7728 scope.go:117] "RemoveContainer" containerID="deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856" Feb 23 14:26:49.221292 master-0 kubenswrapper[7728]: I0223 14:26:49.221106 7728 scope.go:117] "RemoveContainer" containerID="5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b" Feb 23 14:26:49.221408 master-0 kubenswrapper[7728]: E0223 14:26:49.221354 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-6f47d587d6-55qjr_openshift-config-operator(92c63c95-e880-4f51-9858-7715343f7bd8)\"" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" podUID="92c63c95-e880-4f51-9858-7715343f7bd8" Feb 23 14:26:50.199254 master-0 kubenswrapper[7728]: I0223 14:26:50.199192 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/2.log" Feb 23 14:26:50.200017 master-0 kubenswrapper[7728]: I0223 14:26:50.199269 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerStarted","Data":"71999ac05242974407057ccc6a31a0e5b271cd2c31f7bd1c3957d9ede98478ca"} Feb 23 14:26:50.225533 master-0 kubenswrapper[7728]: I0223 14:26:50.220441 7728 scope.go:117] "RemoveContainer" containerID="a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd" Feb 23 14:26:51.209283 
master-0 kubenswrapper[7728]: I0223 14:26:51.209213 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/2.log" Feb 23 14:26:51.210025 master-0 kubenswrapper[7728]: I0223 14:26:51.209356 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerStarted","Data":"7241c912fdeaeebbd740c87e12cc1e6c5e87f04c4117739e58873e1ff2b89ecc"} Feb 23 14:26:51.211172 master-0 kubenswrapper[7728]: I0223 14:26:51.211123 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/2.log" Feb 23 14:26:51.211866 master-0 kubenswrapper[7728]: I0223 14:26:51.211748 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/1.log" Feb 23 14:26:51.212237 master-0 kubenswrapper[7728]: I0223 14:26:51.212181 7728 generic.go:334] "Generic (PLEG): container finished" podID="3488a7eb-5170-478c-9af7-490dbe0f514e" containerID="9abb82bc4e660ae80bfe0a01a4c7f25bfcf62f98ac7b617e82941def46f78a19" exitCode=1 Feb 23 14:26:51.212308 master-0 kubenswrapper[7728]: I0223 14:26:51.212280 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerDied","Data":"9abb82bc4e660ae80bfe0a01a4c7f25bfcf62f98ac7b617e82941def46f78a19"} Feb 23 14:26:51.212355 master-0 kubenswrapper[7728]: I0223 14:26:51.212331 7728 scope.go:117] "RemoveContainer" containerID="3ea10ed9b3b081ac010f974ab393059b11852999309c95ddd5381bd40e623b2e" Feb 23 14:26:51.213190 master-0 kubenswrapper[7728]: I0223 14:26:51.212908 7728 scope.go:117] 
"RemoveContainer" containerID="9abb82bc4e660ae80bfe0a01a4c7f25bfcf62f98ac7b617e82941def46f78a19" Feb 23 14:26:51.213190 master-0 kubenswrapper[7728]: E0223 14:26:51.213148 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:26:52.222723 master-0 kubenswrapper[7728]: I0223 14:26:52.222640 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/2.log" Feb 23 14:26:52.739074 master-0 kubenswrapper[7728]: I0223 14:26:52.738984 7728 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:52.739397 master-0 kubenswrapper[7728]: I0223 14:26:52.739109 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:26:52.739743 master-0 kubenswrapper[7728]: I0223 14:26:52.739712 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:26:52.739811 master-0 kubenswrapper[7728]: I0223 14:26:52.739766 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"16c3c1cec998b569c900e35bdf09e7492aae0331e38f36582875c9a5db092d13"} 
pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 23 14:26:52.739865 master-0 kubenswrapper[7728]: I0223 14:26:52.739808 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" containerID="cri-o://16c3c1cec998b569c900e35bdf09e7492aae0331e38f36582875c9a5db092d13" gracePeriod=30 Feb 23 14:26:54.713731 master-0 kubenswrapper[7728]: I0223 14:26:54.713624 7728 patch_prober.go:28] interesting pod/route-controller-manager-8bb99f4f-msq8f container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:54.714296 master-0 kubenswrapper[7728]: I0223 14:26:54.713743 7728 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.64:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:56.199096 master-0 kubenswrapper[7728]: I0223 14:26:56.198996 7728 patch_prober.go:28] interesting pod/authentication-operator-5bd7c86784-mlbx2 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:26:56.199942 master-0 kubenswrapper[7728]: I0223 14:26:56.199792 7728 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" podUID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.10:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:26:56.199942 master-0 kubenswrapper[7728]: I0223 14:26:56.199937 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:26:56.201298 master-0 kubenswrapper[7728]: I0223 14:26:56.201131 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"0d84cce0e88dcc70d83b8dd67a4e91c62d1f30fed4495c32ca427288ab62004f"} pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Feb 23 14:26:56.201298 master-0 kubenswrapper[7728]: I0223 14:26:56.201245 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" podUID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerName="authentication-operator" containerID="cri-o://0d84cce0e88dcc70d83b8dd67a4e91c62d1f30fed4495c32ca427288ab62004f" gracePeriod=30 Feb 23 14:26:58.221162 master-0 kubenswrapper[7728]: I0223 14:26:58.221078 7728 scope.go:117] "RemoveContainer" containerID="3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635" Feb 23 14:26:58.222051 master-0 kubenswrapper[7728]: E0223 14:26:58.221417 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller 
pod=csi-snapshot-controller-6847bb4785-5fw2x_openshift-cluster-storage-operator(2e89a047-9ebc-459b-b7b3-e902c1fb0e17)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" podUID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17" Feb 23 14:26:59.269924 master-0 kubenswrapper[7728]: I0223 14:26:59.269865 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="16c3c1cec998b569c900e35bdf09e7492aae0331e38f36582875c9a5db092d13" exitCode=255 Feb 23 14:26:59.270358 master-0 kubenswrapper[7728]: I0223 14:26:59.269930 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerDied","Data":"16c3c1cec998b569c900e35bdf09e7492aae0331e38f36582875c9a5db092d13"} Feb 23 14:26:59.270358 master-0 kubenswrapper[7728]: I0223 14:26:59.270137 7728 scope.go:117] "RemoveContainer" containerID="db7391fbfbce6a5b2d3e6ea64af30eda73902a756b9cef12c2e6e67aee0522bb" Feb 23 14:26:59.519054 master-0 kubenswrapper[7728]: E0223 14:26:59.518953 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:27:00.220755 master-0 kubenswrapper[7728]: I0223 14:27:00.220678 7728 scope.go:117] "RemoveContainer" containerID="7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17" Feb 23 14:27:00.221050 master-0 kubenswrapper[7728]: E0223 14:27:00.221013 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-controller-manager-operator pod=kube-controller-manager-operator-7bcfbc574b-zdntd_openshift-kube-controller-manager-operator(cf04aca0-8174-4134-835d-37adf6a3b5ca)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" podUID="cf04aca0-8174-4134-835d-37adf6a3b5ca" Feb 23 14:27:00.276446 master-0 kubenswrapper[7728]: I0223 14:27:00.276398 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"c04ab932ba2dd72c66746c81802bd4070c4841a3db596189904c0ef989dfa15b"} Feb 23 14:27:00.276979 master-0 kubenswrapper[7728]: I0223 14:27:00.276909 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:27:00.277185 master-0 kubenswrapper[7728]: E0223 14:27:00.277144 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:27:00.277805 master-0 kubenswrapper[7728]: I0223 14:27:00.277779 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5bd7c86784-mlbx2_e2d00ece-7586-4346-adbb-eaae1aeda69e/authentication-operator/2.log" Feb 23 14:27:00.278086 master-0 kubenswrapper[7728]: I0223 14:27:00.278059 7728 generic.go:334] "Generic (PLEG): container finished" podID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerID="0d84cce0e88dcc70d83b8dd67a4e91c62d1f30fed4495c32ca427288ab62004f" exitCode=255 Feb 23 14:27:00.278150 master-0 kubenswrapper[7728]: I0223 14:27:00.278095 7728 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerDied","Data":"0d84cce0e88dcc70d83b8dd67a4e91c62d1f30fed4495c32ca427288ab62004f"} Feb 23 14:27:00.278150 master-0 kubenswrapper[7728]: I0223 14:27:00.278121 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerStarted","Data":"50bc9e472f842efe389fe206aab27bbc1397e0168e5678badfa7c1d1c6991cdf"} Feb 23 14:27:00.278150 master-0 kubenswrapper[7728]: I0223 14:27:00.278138 7728 scope.go:117] "RemoveContainer" containerID="54011ffa8f000620849835983f9e2c00740786e321e8cd4e4de797c7d208b465" Feb 23 14:27:01.286130 master-0 kubenswrapper[7728]: I0223 14:27:01.286048 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5bd7c86784-mlbx2_e2d00ece-7586-4346-adbb-eaae1aeda69e/authentication-operator/2.log" Feb 23 14:27:01.287234 master-0 kubenswrapper[7728]: I0223 14:27:01.286827 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:27:01.287325 master-0 kubenswrapper[7728]: E0223 14:27:01.287263 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:27:02.221856 master-0 kubenswrapper[7728]: I0223 14:27:02.221780 7728 scope.go:117] "RemoveContainer" containerID="5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b" Feb 23 14:27:02.227562 
master-0 kubenswrapper[7728]: I0223 14:27:02.227474 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pfb9h"] Feb 23 14:27:02.227849 master-0 kubenswrapper[7728]: E0223 14:27:02.227805 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="extract-utilities" Feb 23 14:27:02.227849 master-0 kubenswrapper[7728]: I0223 14:27:02.227828 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="extract-utilities" Feb 23 14:27:02.227849 master-0 kubenswrapper[7728]: E0223 14:27:02.227840 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerName="installer" Feb 23 14:27:02.227849 master-0 kubenswrapper[7728]: I0223 14:27:02.227849 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.227866 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdb9885-7479-43b5-8613-b2857a798ade" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.227875 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdb9885-7479-43b5-8613-b2857a798ade" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.227889 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.227897 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.227909 7728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.227917 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.227940 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="registry-server" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.227949 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="registry-server" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.227962 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerName="extract-utilities" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.227972 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerName="extract-utilities" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.227985 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="registry-server" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.227993 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="registry-server" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.228007 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1bffce5-019a-4c97-85f2-929dc19a0bde" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.228015 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1bffce5-019a-4c97-85f2-929dc19a0bde" containerName="installer" Feb 23 14:27:02.228104 master-0 
kubenswrapper[7728]: E0223 14:27:02.228023 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="extract-utilities" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.228032 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="extract-utilities" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.228047 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.228056 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.228066 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493a9ed3-6d64-489a-a68c-235b69a58782" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.228074 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="493a9ed3-6d64-489a-a68c-235b69a58782" containerName="installer" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.228086 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.228094 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.228103 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ce2f82-05aa-4778-a444-848a408cf570" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.228111 7728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="43ce2f82-05aa-4778-a444-848a408cf570" containerName="extract-content" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: E0223 14:27:02.228123 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ce2f82-05aa-4778-a444-848a408cf570" containerName="extract-utilities" Feb 23 14:27:02.228104 master-0 kubenswrapper[7728]: I0223 14:27:02.228131 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ce2f82-05aa-4778-a444-848a408cf570" containerName="extract-utilities" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228240 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0bf4c4-8272-4f24-8e48-525d7a278b26" containerName="extract-content" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228259 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1bffce5-019a-4c97-85f2-929dc19a0bde" containerName="installer" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228269 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="493a9ed3-6d64-489a-a68c-235b69a58782" containerName="installer" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228281 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerName="installer" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228295 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fdb9885-7479-43b5-8613-b2857a798ade" containerName="installer" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228308 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="b993917a-bce8-4467-a09d-bfc923a90460" containerName="registry-server" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228321 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a398c0f-1b6a-4836-a8b4-33b004350d84" containerName="registry-server" Feb 23 
14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228336 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerName="installer" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.228350 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ce2f82-05aa-4778-a444-848a408cf570" containerName="extract-content" Feb 23 14:27:02.229999 master-0 kubenswrapper[7728]: I0223 14:27:02.229238 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.232822 master-0 kubenswrapper[7728]: I0223 14:27:02.232749 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-hcgtz" Feb 23 14:27:02.256563 master-0 kubenswrapper[7728]: I0223 14:27:02.256425 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fjpvt"] Feb 23 14:27:02.259607 master-0 kubenswrapper[7728]: I0223 14:27:02.259543 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.262678 master-0 kubenswrapper[7728]: I0223 14:27:02.262622 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tl6dk"] Feb 23 14:27:02.263979 master-0 kubenswrapper[7728]: I0223 14:27:02.263935 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-b9hzz" Feb 23 14:27:02.265384 master-0 kubenswrapper[7728]: I0223 14:27:02.265345 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.266611 master-0 kubenswrapper[7728]: I0223 14:27:02.266570 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cdrlk"] Feb 23 14:27:02.267383 master-0 kubenswrapper[7728]: I0223 14:27:02.267358 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-xc5pk" Feb 23 14:27:02.268596 master-0 kubenswrapper[7728]: I0223 14:27:02.268562 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.269872 master-0 kubenswrapper[7728]: I0223 14:27:02.269829 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fjpvt"] Feb 23 14:27:02.270168 master-0 kubenswrapper[7728]: I0223 14:27:02.270102 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-jgw8n" Feb 23 14:27:02.288617 master-0 kubenswrapper[7728]: I0223 14:27:02.288545 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfb9h"] Feb 23 14:27:02.296933 master-0 kubenswrapper[7728]: I0223 14:27:02.296546 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cdrlk"] Feb 23 14:27:02.302652 master-0 kubenswrapper[7728]: I0223 14:27:02.302547 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tl6dk"] Feb 23 14:27:02.410188 master-0 kubenswrapper[7728]: I0223 14:27:02.410128 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzn8r\" (UniqueName: \"kubernetes.io/projected/c84f66f0-207e-436a-8f4e-d1971fa815eb-kube-api-access-gzn8r\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " 
pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.410310 master-0 kubenswrapper[7728]: I0223 14:27:02.410222 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxqg\" (UniqueName: \"kubernetes.io/projected/adbf8f71-f005-4e5b-9de1-e49559cf7386-kube-api-access-5vxqg\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.410310 master-0 kubenswrapper[7728]: I0223 14:27:02.410248 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-utilities\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.410397 master-0 kubenswrapper[7728]: I0223 14:27:02.410318 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-utilities\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.411024 master-0 kubenswrapper[7728]: I0223 14:27:02.410985 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-utilities\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.411113 master-0 kubenswrapper[7728]: I0223 14:27:02.411051 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-utilities\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.411113 master-0 kubenswrapper[7728]: I0223 14:27:02.411069 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-catalog-content\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.411209 master-0 kubenswrapper[7728]: I0223 14:27:02.411137 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-catalog-content\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.411450 master-0 kubenswrapper[7728]: I0223 14:27:02.411422 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-catalog-content\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.411547 master-0 kubenswrapper[7728]: I0223 14:27:02.411533 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-catalog-content\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.412020 master-0 kubenswrapper[7728]: I0223 14:27:02.411750 7728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc4t\" (UniqueName: \"kubernetes.io/projected/efdde2df-cd07-4898-88f4-7ecde0e04d7a-kube-api-access-tpc4t\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.412020 master-0 kubenswrapper[7728]: I0223 14:27:02.411792 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp8kk\" (UniqueName: \"kubernetes.io/projected/3f86e881-275c-4387-a23a-06c559c8f1e8-kube-api-access-wp8kk\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.512974 master-0 kubenswrapper[7728]: I0223 14:27:02.512766 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-utilities\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.512974 master-0 kubenswrapper[7728]: I0223 14:27:02.512835 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-utilities\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.512974 master-0 kubenswrapper[7728]: I0223 14:27:02.512856 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-catalog-content\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 
23 14:27:02.512974 master-0 kubenswrapper[7728]: I0223 14:27:02.512881 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-catalog-content\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.512974 master-0 kubenswrapper[7728]: I0223 14:27:02.512903 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-catalog-content\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.512974 master-0 kubenswrapper[7728]: I0223 14:27:02.512950 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-catalog-content\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.512974 master-0 kubenswrapper[7728]: I0223 14:27:02.512977 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc4t\" (UniqueName: \"kubernetes.io/projected/efdde2df-cd07-4898-88f4-7ecde0e04d7a-kube-api-access-tpc4t\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.513565 master-0 kubenswrapper[7728]: I0223 14:27:02.513005 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp8kk\" (UniqueName: \"kubernetes.io/projected/3f86e881-275c-4387-a23a-06c559c8f1e8-kube-api-access-wp8kk\") pod \"redhat-marketplace-pfb9h\" (UID: 
\"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.513565 master-0 kubenswrapper[7728]: I0223 14:27:02.513030 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzn8r\" (UniqueName: \"kubernetes.io/projected/c84f66f0-207e-436a-8f4e-d1971fa815eb-kube-api-access-gzn8r\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.513565 master-0 kubenswrapper[7728]: I0223 14:27:02.513056 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxqg\" (UniqueName: \"kubernetes.io/projected/adbf8f71-f005-4e5b-9de1-e49559cf7386-kube-api-access-5vxqg\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.513565 master-0 kubenswrapper[7728]: I0223 14:27:02.513076 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-utilities\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.513565 master-0 kubenswrapper[7728]: I0223 14:27:02.513097 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-utilities\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.513565 master-0 kubenswrapper[7728]: I0223 14:27:02.513328 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-utilities\") pod 
\"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.513839 master-0 kubenswrapper[7728]: I0223 14:27:02.513604 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-utilities\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.514192 master-0 kubenswrapper[7728]: I0223 14:27:02.514157 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-catalog-content\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.514296 master-0 kubenswrapper[7728]: I0223 14:27:02.514263 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-utilities\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.514704 master-0 kubenswrapper[7728]: I0223 14:27:02.514661 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-utilities\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.514793 master-0 kubenswrapper[7728]: I0223 14:27:02.514757 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-catalog-content\") pod \"certified-operators-cdrlk\" (UID: 
\"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.514949 master-0 kubenswrapper[7728]: I0223 14:27:02.514905 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-catalog-content\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.515567 master-0 kubenswrapper[7728]: I0223 14:27:02.515353 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-catalog-content\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.533642 master-0 kubenswrapper[7728]: I0223 14:27:02.533593 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzn8r\" (UniqueName: \"kubernetes.io/projected/c84f66f0-207e-436a-8f4e-d1971fa815eb-kube-api-access-gzn8r\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.533818 master-0 kubenswrapper[7728]: I0223 14:27:02.533766 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxqg\" (UniqueName: \"kubernetes.io/projected/adbf8f71-f005-4e5b-9de1-e49559cf7386-kube-api-access-5vxqg\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.539234 master-0 kubenswrapper[7728]: I0223 14:27:02.539172 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc4t\" (UniqueName: \"kubernetes.io/projected/efdde2df-cd07-4898-88f4-7ecde0e04d7a-kube-api-access-tpc4t\") pod 
\"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:02.542144 master-0 kubenswrapper[7728]: I0223 14:27:02.542088 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp8kk\" (UniqueName: \"kubernetes.io/projected/3f86e881-275c-4387-a23a-06c559c8f1e8-kube-api-access-wp8kk\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.558268 master-0 kubenswrapper[7728]: I0223 14:27:02.558208 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:02.638102 master-0 kubenswrapper[7728]: I0223 14:27:02.638018 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:02.686440 master-0 kubenswrapper[7728]: I0223 14:27:02.685662 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:02.696909 master-0 kubenswrapper[7728]: I0223 14:27:02.696845 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:03.020274 master-0 kubenswrapper[7728]: I0223 14:27:03.018117 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfb9h"] Feb 23 14:27:03.024747 master-0 kubenswrapper[7728]: W0223 14:27:03.024696 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f86e881_275c_4387_a23a_06c559c8f1e8.slice/crio-2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb WatchSource:0}: Error finding container 2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb: Status 404 returned error can't find the container with id 2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb Feb 23 14:27:03.123222 master-0 kubenswrapper[7728]: I0223 14:27:03.123183 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fjpvt"] Feb 23 14:27:03.127108 master-0 kubenswrapper[7728]: W0223 14:27:03.127054 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbf8f71_f005_4e5b_9de1_e49559cf7386.slice/crio-244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9 WatchSource:0}: Error finding container 244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9: Status 404 returned error can't find the container with id 244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9 Feb 23 14:27:03.191916 master-0 kubenswrapper[7728]: I0223 14:27:03.191694 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tl6dk"] Feb 23 14:27:03.194924 master-0 kubenswrapper[7728]: I0223 14:27:03.194843 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cdrlk"] Feb 23 14:27:03.299396 master-0 kubenswrapper[7728]: I0223 14:27:03.299343 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerStarted","Data":"244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9"} Feb 23 14:27:03.300532 master-0 kubenswrapper[7728]: I0223 14:27:03.300464 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerStarted","Data":"223e6055ac3ccbf4f5095a5d877f5de6b8592fbf41a24e3985f9a14f56619a70"} Feb 23 14:27:03.302769 master-0 kubenswrapper[7728]: I0223 14:27:03.302732 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/4.log" Feb 23 14:27:03.303293 master-0 kubenswrapper[7728]: I0223 14:27:03.303265 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"b6071b4962f53457c312546180e687a36b9dd499d861714917a6ca4caba881c5"} Feb 23 14:27:03.303630 master-0 kubenswrapper[7728]: I0223 14:27:03.303528 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:27:03.304907 master-0 kubenswrapper[7728]: I0223 14:27:03.304875 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerStarted","Data":"4c87997a2f68dd1175880f954711e05356d4ed55a7a6f3583b752f9d11da5e55"} Feb 23 14:27:03.307910 master-0 kubenswrapper[7728]: I0223 14:27:03.307859 7728 generic.go:334] "Generic (PLEG): container finished" podID="3f86e881-275c-4387-a23a-06c559c8f1e8" 
containerID="d34072e7de379cff9844dcf3892b8004b156d1d5b0fd3f937aa6aac0ab1f96bb" exitCode=0 Feb 23 14:27:03.308002 master-0 kubenswrapper[7728]: I0223 14:27:03.307918 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerDied","Data":"d34072e7de379cff9844dcf3892b8004b156d1d5b0fd3f937aa6aac0ab1f96bb"} Feb 23 14:27:03.308002 master-0 kubenswrapper[7728]: I0223 14:27:03.307950 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerStarted","Data":"2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb"} Feb 23 14:27:03.310291 master-0 kubenswrapper[7728]: I0223 14:27:03.310266 7728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 14:27:03.718572 master-0 kubenswrapper[7728]: I0223 14:27:03.718522 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:27:04.316523 master-0 kubenswrapper[7728]: I0223 14:27:04.316411 7728 generic.go:334] "Generic (PLEG): container finished" podID="efdde2df-cd07-4898-88f4-7ecde0e04d7a" containerID="9f45d1abf22f4312045c038d142f9bba7b80278a0653e6693862acdb73f898f7" exitCode=0 Feb 23 14:27:04.326010 master-0 kubenswrapper[7728]: I0223 14:27:04.316511 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerDied","Data":"9f45d1abf22f4312045c038d142f9bba7b80278a0653e6693862acdb73f898f7"} Feb 23 14:27:04.326010 master-0 kubenswrapper[7728]: I0223 14:27:04.318612 7728 generic.go:334] "Generic (PLEG): container finished" podID="adbf8f71-f005-4e5b-9de1-e49559cf7386" 
containerID="ce4e5fa17dbd27ef1bc8352b48b01fe69433b598b77046f093daa7a26c560341" exitCode=0 Feb 23 14:27:04.326010 master-0 kubenswrapper[7728]: I0223 14:27:04.319219 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerDied","Data":"ce4e5fa17dbd27ef1bc8352b48b01fe69433b598b77046f093daa7a26c560341"} Feb 23 14:27:04.326010 master-0 kubenswrapper[7728]: I0223 14:27:04.322097 7728 generic.go:334] "Generic (PLEG): container finished" podID="c84f66f0-207e-436a-8f4e-d1971fa815eb" containerID="d64c027236a9d4db40738067d4f95aca5d20f4b4daf356084c952897b507ab24" exitCode=0 Feb 23 14:27:04.326010 master-0 kubenswrapper[7728]: I0223 14:27:04.322195 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerDied","Data":"d64c027236a9d4db40738067d4f95aca5d20f4b4daf356084c952897b507ab24"} Feb 23 14:27:05.344264 master-0 kubenswrapper[7728]: I0223 14:27:05.344188 7728 generic.go:334] "Generic (PLEG): container finished" podID="3f86e881-275c-4387-a23a-06c559c8f1e8" containerID="dd1e3de2ec0845831f5cf402eb5eb3565db815b7cdf1b7b1df427c27e6e8d027" exitCode=0 Feb 23 14:27:05.344941 master-0 kubenswrapper[7728]: I0223 14:27:05.344317 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerDied","Data":"dd1e3de2ec0845831f5cf402eb5eb3565db815b7cdf1b7b1df427c27e6e8d027"} Feb 23 14:27:05.353066 master-0 kubenswrapper[7728]: I0223 14:27:05.353004 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerStarted","Data":"ee32dcd7aedbc475bca90e7ca9047218b836991a39e06b3f0a1f7b5cd0f0132b"} Feb 23 14:27:05.355761 master-0 
kubenswrapper[7728]: I0223 14:27:05.355729 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerStarted","Data":"6fcd03423b6cd43db7e56c4b05dae69802a9ed15c0ea5b94c83b762001bf26a8"} Feb 23 14:27:05.910254 master-0 kubenswrapper[7728]: I0223 14:27:05.910176 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:05.910977 master-0 kubenswrapper[7728]: I0223 14:27:05.910938 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:27:05.911311 master-0 kubenswrapper[7728]: E0223 14:27:05.911270 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:27:05.993239 master-0 kubenswrapper[7728]: I0223 14:27:05.993175 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 14:27:05.993437 master-0 kubenswrapper[7728]: I0223 14:27:05.993267 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 14:27:06.220530 master-0 
kubenswrapper[7728]: I0223 14:27:06.220427 7728 scope.go:117] "RemoveContainer" containerID="9abb82bc4e660ae80bfe0a01a4c7f25bfcf62f98ac7b617e82941def46f78a19" Feb 23 14:27:06.220990 master-0 kubenswrapper[7728]: E0223 14:27:06.220929 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:27:06.368428 master-0 kubenswrapper[7728]: I0223 14:27:06.368342 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerStarted","Data":"9deb346771c78fcd8dfbd7481f6daa1fde9e767e61631dcce1b8d456640db83f"} Feb 23 14:27:06.371096 master-0 kubenswrapper[7728]: I0223 14:27:06.371038 7728 generic.go:334] "Generic (PLEG): container finished" podID="adbf8f71-f005-4e5b-9de1-e49559cf7386" containerID="82543a2650cb25bd6bfd3b4eeba404e288e77827c47ab01f21f6a40862867df7" exitCode=0 Feb 23 14:27:06.371203 master-0 kubenswrapper[7728]: I0223 14:27:06.371128 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerDied","Data":"82543a2650cb25bd6bfd3b4eeba404e288e77827c47ab01f21f6a40862867df7"} Feb 23 14:27:06.375871 master-0 kubenswrapper[7728]: I0223 14:27:06.375081 7728 generic.go:334] "Generic (PLEG): container finished" podID="c84f66f0-207e-436a-8f4e-d1971fa815eb" containerID="ee32dcd7aedbc475bca90e7ca9047218b836991a39e06b3f0a1f7b5cd0f0132b" exitCode=0 Feb 23 14:27:06.375871 master-0 kubenswrapper[7728]: I0223 14:27:06.375210 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerDied","Data":"ee32dcd7aedbc475bca90e7ca9047218b836991a39e06b3f0a1f7b5cd0f0132b"} Feb 23 14:27:06.378812 master-0 kubenswrapper[7728]: I0223 14:27:06.378758 7728 generic.go:334] "Generic (PLEG): container finished" podID="efdde2df-cd07-4898-88f4-7ecde0e04d7a" containerID="6fcd03423b6cd43db7e56c4b05dae69802a9ed15c0ea5b94c83b762001bf26a8" exitCode=0 Feb 23 14:27:06.378812 master-0 kubenswrapper[7728]: I0223 14:27:06.378805 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerDied","Data":"6fcd03423b6cd43db7e56c4b05dae69802a9ed15c0ea5b94c83b762001bf26a8"} Feb 23 14:27:06.379049 master-0 kubenswrapper[7728]: I0223 14:27:06.378838 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerStarted","Data":"bf5dc0832d3e371388160283edfa78766c830a1d36b70ceb5328f1c5793751db"} Feb 23 14:27:06.418942 master-0 kubenswrapper[7728]: I0223 14:27:06.418597 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pfb9h" podStartSLOduration=43.882370641 podStartE2EDuration="46.418559462s" podCreationTimestamp="2026-02-23 14:26:20 +0000 UTC" firstStartedPulling="2026-02-23 14:27:03.310201152 +0000 UTC m=+516.272862448" lastFinishedPulling="2026-02-23 14:27:05.846389973 +0000 UTC m=+518.809051269" observedRunningTime="2026-02-23 14:27:06.415981613 +0000 UTC m=+519.378642919" watchObservedRunningTime="2026-02-23 14:27:06.418559462 +0000 UTC m=+519.381220788" Feb 23 14:27:06.447795 master-0 kubenswrapper[7728]: I0223 14:27:06.447696 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cdrlk" 
podStartSLOduration=42.968631897 podStartE2EDuration="44.447667854s" podCreationTimestamp="2026-02-23 14:26:22 +0000 UTC" firstStartedPulling="2026-02-23 14:27:04.317893338 +0000 UTC m=+517.280554634" lastFinishedPulling="2026-02-23 14:27:05.796929255 +0000 UTC m=+518.759590591" observedRunningTime="2026-02-23 14:27:06.445465955 +0000 UTC m=+519.408127281" watchObservedRunningTime="2026-02-23 14:27:06.447667854 +0000 UTC m=+519.410329160" Feb 23 14:27:06.492059 master-0 kubenswrapper[7728]: I0223 14:27:06.491874 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:27:07.387459 master-0 kubenswrapper[7728]: I0223 14:27:07.387388 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerStarted","Data":"73be864b5b54929659f002b9332c455fcb5b8b1df377e86d1c53be302d99b753"} Feb 23 14:27:07.390098 master-0 kubenswrapper[7728]: I0223 14:27:07.390056 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerStarted","Data":"7684169f87c9b5161cf129865dea04bd28679344f72d4a78a47e2ebf7f12ba2b"} Feb 23 14:27:07.408550 master-0 kubenswrapper[7728]: I0223 14:27:07.408429 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fjpvt" podStartSLOduration=44.97995484 podStartE2EDuration="47.408410568s" podCreationTimestamp="2026-02-23 14:26:20 +0000 UTC" firstStartedPulling="2026-02-23 14:27:04.320819147 +0000 UTC m=+517.283480433" lastFinishedPulling="2026-02-23 14:27:06.749274855 +0000 UTC m=+519.711936161" observedRunningTime="2026-02-23 14:27:07.40811212 +0000 UTC m=+520.370773426" watchObservedRunningTime="2026-02-23 14:27:07.408410568 +0000 UTC m=+520.371071874" Feb 23 
14:27:07.432041 master-0 kubenswrapper[7728]: I0223 14:27:07.431961 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tl6dk" podStartSLOduration=42.981943913 podStartE2EDuration="45.43194332s" podCreationTimestamp="2026-02-23 14:26:22 +0000 UTC" firstStartedPulling="2026-02-23 14:27:04.323022536 +0000 UTC m=+517.285683862" lastFinishedPulling="2026-02-23 14:27:06.773021973 +0000 UTC m=+519.735683269" observedRunningTime="2026-02-23 14:27:07.431006425 +0000 UTC m=+520.393667731" watchObservedRunningTime="2026-02-23 14:27:07.43194332 +0000 UTC m=+520.394604616" Feb 23 14:27:08.819046 master-0 kubenswrapper[7728]: I0223 14:27:08.819006 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5bd7c86784-mlbx2_e2d00ece-7586-4346-adbb-eaae1aeda69e/authentication-operator/2.log" Feb 23 14:27:08.827360 master-0 kubenswrapper[7728]: I0223 14:27:08.827324 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5bd7c86784-mlbx2_e2d00ece-7586-4346-adbb-eaae1aeda69e/authentication-operator/3.log" Feb 23 14:27:08.893917 master-0 kubenswrapper[7728]: I0223 14:27:08.893877 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67f44b4d6d-7lpn4_ea0b3538-9a7d-4995-b628-2d63f21d683c/fix-audit-permissions/0.log" Feb 23 14:27:09.096984 master-0 kubenswrapper[7728]: I0223 14:27:09.096860 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67f44b4d6d-7lpn4_ea0b3538-9a7d-4995-b628-2d63f21d683c/oauth-apiserver/0.log" Feb 23 14:27:09.296795 master-0 kubenswrapper[7728]: I0223 14:27:09.296717 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/kube-rbac-proxy/0.log" Feb 23 14:27:09.492715 
master-0 kubenswrapper[7728]: I0223 14:27:09.492461 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/cluster-autoscaler-operator/1.log" Feb 23 14:27:09.738379 master-0 kubenswrapper[7728]: I0223 14:27:09.738288 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:09.738918 master-0 kubenswrapper[7728]: I0223 14:27:09.738876 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:27:09.739139 master-0 kubenswrapper[7728]: E0223 14:27:09.739088 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:27:09.745831 master-0 kubenswrapper[7728]: I0223 14:27:09.745699 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:09.938467 master-0 kubenswrapper[7728]: I0223 14:27:09.938390 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/3.log" Feb 23 14:27:09.946361 master-0 kubenswrapper[7728]: I0223 14:27:09.946305 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/baremetal-kube-rbac-proxy/0.log" Feb 23 14:27:10.093577 master-0 kubenswrapper[7728]: I0223 14:27:10.093418 7728 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-686847ff5f-9q266_4373687a-61a0-434b-81f7-3fecaa1494ef/control-plane-machine-set-operator/1.log" Feb 23 14:27:10.303793 master-0 kubenswrapper[7728]: I0223 14:27:10.303715 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5c7cf458b4-bb7zl_ceba7b56-f910-473d-aed5-add94868fb31/kube-rbac-proxy/0.log" Feb 23 14:27:10.407184 master-0 kubenswrapper[7728]: I0223 14:27:10.407064 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:27:10.407387 master-0 kubenswrapper[7728]: E0223 14:27:10.407297 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-controller-manager pod=bootstrap-kube-controller-manager-master-0_kube-system(c9ad9373c007a4fcd25e70622bdc8deb)\"" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" Feb 23 14:27:10.414702 master-0 kubenswrapper[7728]: I0223 14:27:10.414668 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:10.495351 master-0 kubenswrapper[7728]: I0223 14:27:10.495287 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5c7cf458b4-bb7zl_ceba7b56-f910-473d-aed5-add94868fb31/machine-api-operator/0.log" Feb 23 14:27:10.936200 master-0 kubenswrapper[7728]: I0223 14:27:10.936117 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/2.log" Feb 23 14:27:10.946255 master-0 kubenswrapper[7728]: I0223 14:27:10.946215 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/3.log" Feb 23 14:27:11.416224 master-0 kubenswrapper[7728]: I0223 14:27:11.416084 7728 scope.go:117] "RemoveContainer" containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:27:11.497460 master-0 kubenswrapper[7728]: I0223 14:27:11.497395 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_5f67ab24-82bc-4e71-b974-e25b819986c8/installer/0.log" Feb 23 14:27:11.694039 master-0 kubenswrapper[7728]: I0223 14:27:11.693845 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/3.log" Feb 23 14:27:11.907198 master-0 kubenswrapper[7728]: I0223 14:27:11.907155 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_687e92a6cecf1e2beeef16a0b322ad08/setup/0.log" Feb 23 14:27:12.098887 master-0 kubenswrapper[7728]: I0223 14:27:12.098720 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_687e92a6cecf1e2beeef16a0b322ad08/kube-apiserver/0.log" Feb 23 14:27:12.221074 master-0 kubenswrapper[7728]: I0223 14:27:12.221023 7728 scope.go:117] "RemoveContainer" containerID="3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635" Feb 23 14:27:12.308761 master-0 kubenswrapper[7728]: I0223 14:27:12.308721 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_bootstrap-kube-apiserver-master-0_687e92a6cecf1e2beeef16a0b322ad08/kube-apiserver-insecure-readyz/0.log" Feb 23 14:27:12.421763 master-0 kubenswrapper[7728]: I0223 14:27:12.421737 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/3.log" Feb 23 14:27:12.422031 master-0 kubenswrapper[7728]: I0223 14:27:12.422012 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"0189afd080f3b6ea37eeb4ab1c36f7fceab4ba1f69c22daf84852629bcff7a8b"} Feb 23 14:27:12.424312 master-0 kubenswrapper[7728]: I0223 14:27:12.424296 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"c9ad9373c007a4fcd25e70622bdc8deb","Type":"ContainerStarted","Data":"412625d61293576a0c0a8ae370cd71d0f9c06b3bc76402a07899ef292404e66d"} Feb 23 14:27:12.503887 master-0 kubenswrapper[7728]: I0223 14:27:12.503826 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_29f7b30e-bf6a-4e54-b009-1b0fcd830035/installer/0.log" Feb 23 14:27:12.559464 master-0 kubenswrapper[7728]: I0223 14:27:12.559366 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:12.559464 master-0 kubenswrapper[7728]: I0223 14:27:12.559442 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:12.610717 master-0 kubenswrapper[7728]: I0223 14:27:12.610672 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:12.638281 master-0 kubenswrapper[7728]: I0223 14:27:12.638174 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:12.638577 master-0 kubenswrapper[7728]: I0223 14:27:12.638557 7728 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:12.683634 master-0 kubenswrapper[7728]: I0223 14:27:12.683577 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:12.686655 master-0 kubenswrapper[7728]: I0223 14:27:12.686623 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:12.687534 master-0 kubenswrapper[7728]: I0223 14:27:12.687516 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:12.698144 master-0 kubenswrapper[7728]: I0223 14:27:12.697145 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0fdb9885-7479-43b5-8613-b2857a798ade/installer/0.log" Feb 23 14:27:12.698144 master-0 kubenswrapper[7728]: I0223 14:27:12.698016 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:12.698144 master-0 kubenswrapper[7728]: I0223 14:27:12.698058 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:12.748070 master-0 kubenswrapper[7728]: I0223 14:27:12.748010 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:12.893585 master-0 kubenswrapper[7728]: I0223 14:27:12.893491 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/2.log" Feb 23 14:27:13.093667 master-0 kubenswrapper[7728]: I0223 14:27:13.093590 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/3.log" Feb 23 14:27:13.294262 master-0 kubenswrapper[7728]: I0223 14:27:13.294183 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/2.log" Feb 23 14:27:13.476769 master-0 kubenswrapper[7728]: I0223 14:27:13.476706 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:27:13.479918 master-0 kubenswrapper[7728]: I0223 14:27:13.479897 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:27:13.480992 master-0 kubenswrapper[7728]: I0223 14:27:13.480968 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:27:13.495020 master-0 kubenswrapper[7728]: I0223 14:27:13.494853 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/3.log" Feb 23 14:27:13.694341 master-0 kubenswrapper[7728]: I0223 14:27:13.694186 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-666b887977-f7h55_588a804a-430a-47f4-aa97-c08e907239da/fix-audit-permissions/0.log" Feb 23 14:27:13.731353 master-0 kubenswrapper[7728]: I0223 14:27:13.731267 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tl6dk" podUID="c84f66f0-207e-436a-8f4e-d1971fa815eb" containerName="registry-server" probeResult="failure" output=< Feb 23 14:27:13.731353 master-0 kubenswrapper[7728]: timeout: failed to connect service 
":50051" within 1s Feb 23 14:27:13.731353 master-0 kubenswrapper[7728]: > Feb 23 14:27:13.900219 master-0 kubenswrapper[7728]: I0223 14:27:13.900161 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-666b887977-f7h55_588a804a-430a-47f4-aa97-c08e907239da/openshift-apiserver/0.log" Feb 23 14:27:14.095425 master-0 kubenswrapper[7728]: I0223 14:27:14.095155 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-666b887977-f7h55_588a804a-430a-47f4-aa97-c08e907239da/openshift-apiserver-check-endpoints/0.log" Feb 23 14:27:14.220902 master-0 kubenswrapper[7728]: I0223 14:27:14.220769 7728 scope.go:117] "RemoveContainer" containerID="7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17" Feb 23 14:27:14.294310 master-0 kubenswrapper[7728]: I0223 14:27:14.294228 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/2.log" Feb 23 14:27:14.493531 master-0 kubenswrapper[7728]: I0223 14:27:14.493458 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/3.log" Feb 23 14:27:14.699687 master-0 kubenswrapper[7728]: I0223 14:27:14.699632 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-596f79dd6f-mhzxn_255b5a89-1b89-42dc-868a-32ce67975a54/catalog-operator/0.log" Feb 23 14:27:14.901217 master-0 kubenswrapper[7728]: I0223 14:27:14.901122 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-5499d7f7bb-t45zz_0cebb80d-d898-44c8-82b3-1e18833cee3f/olm-operator/0.log" Feb 23 14:27:15.297582 master-0 kubenswrapper[7728]: I0223 14:27:15.297456 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-5c75f78c8b-cj2l7_5b54fc16-d2f7-4b10-a611-5b411b389c5a/package-server-manager/0.log" Feb 23 14:27:15.452826 master-0 kubenswrapper[7728]: I0223 14:27:15.452732 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/3.log" Feb 23 14:27:15.453149 master-0 kubenswrapper[7728]: I0223 14:27:15.452880 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"6333d975bdd0987ec01df36eb0d3e9c836ef6da8f8df4fea72b742962ec92ffd"} Feb 23 14:27:15.493379 master-0 kubenswrapper[7728]: I0223 14:27:15.493308 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-5c75f78c8b-cj2l7_5b54fc16-d2f7-4b10-a611-5b411b389c5a/kube-rbac-proxy/0.log" Feb 23 14:27:15.757160 master-0 kubenswrapper[7728]: I0223 14:27:15.757090 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-5c75f78c8b-cj2l7_5b54fc16-d2f7-4b10-a611-5b411b389c5a/package-server-manager/1.log" Feb 23 14:27:16.225912 master-0 kubenswrapper[7728]: I0223 14:27:16.225807 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-65c9585877-m66zh_0315476e-7140-4777-8061-9cead4c92024/packageserver/0.log" Feb 23 14:27:16.363905 master-0 kubenswrapper[7728]: I0223 14:27:16.363822 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:16.367783 master-0 kubenswrapper[7728]: I0223 14:27:16.367725 7728 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:16.461530 master-0 kubenswrapper[7728]: I0223 14:27:16.461435 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:19.221610 master-0 kubenswrapper[7728]: I0223 14:27:19.221450 7728 scope.go:117] "RemoveContainer" containerID="9abb82bc4e660ae80bfe0a01a4c7f25bfcf62f98ac7b617e82941def46f78a19" Feb 23 14:27:19.491468 master-0 kubenswrapper[7728]: I0223 14:27:19.491314 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/2.log" Feb 23 14:27:19.491874 master-0 kubenswrapper[7728]: I0223 14:27:19.491814 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19"} Feb 23 14:27:22.730930 master-0 kubenswrapper[7728]: I0223 14:27:22.730856 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:22.788618 master-0 kubenswrapper[7728]: I0223 14:27:22.788567 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:27:27.443599 master-0 kubenswrapper[7728]: I0223 14:27:27.443515 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:27:35.992803 master-0 kubenswrapper[7728]: I0223 14:27:35.992647 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 14:27:35.992803 master-0 kubenswrapper[7728]: I0223 14:27:35.992757 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 14:28:05.993028 master-0 kubenswrapper[7728]: I0223 14:28:05.992885 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 14:28:05.993028 master-0 kubenswrapper[7728]: I0223 14:28:05.992964 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 14:28:05.993028 master-0 kubenswrapper[7728]: I0223 14:28:05.993019 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:28:05.993704 master-0 kubenswrapper[7728]: I0223 14:28:05.993622 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f6e215689332bab70cfb5b43a7cfcbaa2bd241cb7a2a1c1757464250604d426"} pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 14:28:05.993704 master-0 kubenswrapper[7728]: 
I0223 14:28:05.993682 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" containerID="cri-o://4f6e215689332bab70cfb5b43a7cfcbaa2bd241cb7a2a1c1757464250604d426" gracePeriod=600 Feb 23 14:28:06.834554 master-0 kubenswrapper[7728]: I0223 14:28:06.834445 7728 generic.go:334] "Generic (PLEG): container finished" podID="76c67569-3a72-4de9-87cd-432a4607b15b" containerID="4f6e215689332bab70cfb5b43a7cfcbaa2bd241cb7a2a1c1757464250604d426" exitCode=0 Feb 23 14:28:06.834554 master-0 kubenswrapper[7728]: I0223 14:28:06.834526 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerDied","Data":"4f6e215689332bab70cfb5b43a7cfcbaa2bd241cb7a2a1c1757464250604d426"} Feb 23 14:28:06.834794 master-0 kubenswrapper[7728]: I0223 14:28:06.834592 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"0ccc3d4cff85a134107f96cc12ad89d4b417a48e075d82e5410bc67ca88a884e"} Feb 23 14:28:06.834794 master-0 kubenswrapper[7728]: I0223 14:28:06.834630 7728 scope.go:117] "RemoveContainer" containerID="a927374fcf62fad56c9d8325450d07e92c07f04787ed291d9c0071fab4d22549" Feb 23 14:29:30.398175 master-0 kubenswrapper[7728]: I0223 14:29:30.398130 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/3.log" Feb 23 14:29:30.398752 master-0 kubenswrapper[7728]: I0223 14:29:30.398684 7728 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/2.log" Feb 23 14:29:30.399077 master-0 kubenswrapper[7728]: I0223 14:29:30.399049 7728 generic.go:334] "Generic (PLEG): container finished" podID="3488a7eb-5170-478c-9af7-490dbe0f514e" containerID="ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19" exitCode=1 Feb 23 14:29:30.399141 master-0 kubenswrapper[7728]: I0223 14:29:30.399088 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerDied","Data":"ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19"} Feb 23 14:29:30.399141 master-0 kubenswrapper[7728]: I0223 14:29:30.399124 7728 scope.go:117] "RemoveContainer" containerID="9abb82bc4e660ae80bfe0a01a4c7f25bfcf62f98ac7b617e82941def46f78a19" Feb 23 14:29:30.399835 master-0 kubenswrapper[7728]: I0223 14:29:30.399544 7728 scope.go:117] "RemoveContainer" containerID="ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19" Feb 23 14:29:30.400042 master-0 kubenswrapper[7728]: E0223 14:29:30.399999 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:29:31.408208 master-0 kubenswrapper[7728]: I0223 14:29:31.408148 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/3.log" Feb 23 14:29:41.221586 master-0 kubenswrapper[7728]: I0223 14:29:41.221464 7728 scope.go:117] 
"RemoveContainer" containerID="ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19" Feb 23 14:29:41.222351 master-0 kubenswrapper[7728]: E0223 14:29:41.222228 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:29:52.220554 master-0 kubenswrapper[7728]: I0223 14:29:52.220493 7728 scope.go:117] "RemoveContainer" containerID="ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19" Feb 23 14:29:52.221192 master-0 kubenswrapper[7728]: E0223 14:29:52.220742 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:30:03.220647 master-0 kubenswrapper[7728]: I0223 14:30:03.220561 7728 scope.go:117] "RemoveContainer" containerID="ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19" Feb 23 14:30:03.221448 master-0 kubenswrapper[7728]: E0223 14:30:03.220798 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:30:16.221136 master-0 
kubenswrapper[7728]: I0223 14:30:16.221061 7728 scope.go:117] "RemoveContainer" containerID="ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19"
Feb 23 14:30:16.713547 master-0 kubenswrapper[7728]: I0223 14:30:16.713415 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/3.log"
Feb 23 14:30:16.713959 master-0 kubenswrapper[7728]: I0223 14:30:16.713912 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f"}
Feb 23 14:30:35.993210 master-0 kubenswrapper[7728]: I0223 14:30:35.993106 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 14:30:35.993210 master-0 kubenswrapper[7728]: I0223 14:30:35.993199 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 14:31:05.993385 master-0 kubenswrapper[7728]: I0223 14:31:05.993206 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 14:31:05.993385 master-0 kubenswrapper[7728]: I0223 14:31:05.993302 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 14:31:35.992916 master-0 kubenswrapper[7728]: I0223 14:31:35.992795 7728 patch_prober.go:28] interesting pod/machine-config-daemon-fhcgg container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 14:31:35.992916 master-0 kubenswrapper[7728]: I0223 14:31:35.992901 7728 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 14:31:35.996042 master-0 kubenswrapper[7728]: I0223 14:31:35.992983 7728 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:31:35.996042 master-0 kubenswrapper[7728]: I0223 14:31:35.993898 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ccc3d4cff85a134107f96cc12ad89d4b417a48e075d82e5410bc67ca88a884e"} pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 23 14:31:35.996042 master-0 kubenswrapper[7728]: I0223 14:31:35.994002 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" podUID="76c67569-3a72-4de9-87cd-432a4607b15b" containerName="machine-config-daemon" containerID="cri-o://0ccc3d4cff85a134107f96cc12ad89d4b417a48e075d82e5410bc67ca88a884e" gracePeriod=600
Feb 23 14:31:36.261853 master-0 kubenswrapper[7728]: I0223 14:31:36.261677 7728 generic.go:334] "Generic (PLEG): container finished" podID="76c67569-3a72-4de9-87cd-432a4607b15b" containerID="0ccc3d4cff85a134107f96cc12ad89d4b417a48e075d82e5410bc67ca88a884e" exitCode=0
Feb 23 14:31:36.261853 master-0 kubenswrapper[7728]: I0223 14:31:36.261728 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerDied","Data":"0ccc3d4cff85a134107f96cc12ad89d4b417a48e075d82e5410bc67ca88a884e"}
Feb 23 14:31:36.261853 master-0 kubenswrapper[7728]: I0223 14:31:36.261814 7728 scope.go:117] "RemoveContainer" containerID="4f6e215689332bab70cfb5b43a7cfcbaa2bd241cb7a2a1c1757464250604d426"
Feb 23 14:31:37.271324 master-0 kubenswrapper[7728]: I0223 14:31:37.271248 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"124a68bb47c3e30bb951c84920269609e54235b4ea92c589f9e5c85ddedbc17d"}
Feb 23 14:31:59.657079 master-0 kubenswrapper[7728]: I0223 14:31:59.657008 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb"]
Feb 23 14:31:59.657898 master-0 kubenswrapper[7728]: I0223 14:31:59.657293 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="kube-rbac-proxy" containerID="cri-o://b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37" gracePeriod=30
Feb 23 14:31:59.657898 master-0 kubenswrapper[7728]: I0223 14:31:59.657317 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="config-sync-controllers" containerID="cri-o://6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100" gracePeriod=30
Feb 23 14:31:59.657898 master-0 kubenswrapper[7728]: I0223 14:31:59.657393 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="cluster-cloud-controller-manager" containerID="cri-o://56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d" gracePeriod=30
Feb 23 14:31:59.836084 master-0 kubenswrapper[7728]: I0223 14:31:59.836035 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/config-sync-controllers/0.log"
Feb 23 14:31:59.836500 master-0 kubenswrapper[7728]: I0223 14:31:59.836448 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/cluster-cloud-controller-manager/0.log"
Feb 23 14:31:59.836615 master-0 kubenswrapper[7728]: I0223 14:31:59.836546 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb"
Feb 23 14:31:59.908649 master-0 kubenswrapper[7728]: I0223 14:31:59.908380 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"]
Feb 23 14:31:59.908934 master-0 kubenswrapper[7728]: E0223 14:31:59.908824 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="cluster-cloud-controller-manager"
Feb 23 14:31:59.908934 master-0 kubenswrapper[7728]: I0223 14:31:59.908849 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="cluster-cloud-controller-manager"
Feb 23 14:31:59.908934 master-0 kubenswrapper[7728]: E0223 14:31:59.908901 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="kube-rbac-proxy"
Feb 23 14:31:59.908934 master-0 kubenswrapper[7728]: I0223 14:31:59.908913 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="kube-rbac-proxy"
Feb 23 14:31:59.909364 master-0 kubenswrapper[7728]: E0223 14:31:59.908950 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="config-sync-controllers"
Feb 23 14:31:59.909364 master-0 kubenswrapper[7728]: I0223 14:31:59.908963 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="config-sync-controllers"
Feb 23 14:31:59.909364 master-0 kubenswrapper[7728]: I0223 14:31:59.909135 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="cluster-cloud-controller-manager"
Feb 23 14:31:59.909364 master-0 kubenswrapper[7728]: I0223 14:31:59.909162 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="kube-rbac-proxy"
Feb 23 14:31:59.909364 master-0 kubenswrapper[7728]: I0223 14:31:59.909186 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="cluster-cloud-controller-manager"
Feb 23 14:31:59.909364 master-0 kubenswrapper[7728]: I0223 14:31:59.909207 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="config-sync-controllers"
Feb 23 14:31:59.909364 master-0 kubenswrapper[7728]: I0223 14:31:59.909227 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="config-sync-controllers"
Feb 23 14:31:59.910035 master-0 kubenswrapper[7728]: E0223 14:31:59.909397 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="config-sync-controllers"
Feb 23 14:31:59.910035 master-0 kubenswrapper[7728]: I0223 14:31:59.909412 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="config-sync-controllers"
Feb 23 14:31:59.910035 master-0 kubenswrapper[7728]: E0223 14:31:59.909460 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="cluster-cloud-controller-manager"
Feb 23 14:31:59.910035 master-0 kubenswrapper[7728]: I0223 14:31:59.909472 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerName="cluster-cloud-controller-manager"
Feb 23 14:31:59.913519 master-0 kubenswrapper[7728]: I0223 14:31:59.910629 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:31:59.978072 master-0 kubenswrapper[7728]: I0223 14:31:59.978014 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-auth-proxy-config\") pod \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") "
Feb 23 14:31:59.978471 master-0 kubenswrapper[7728]: I0223 14:31:59.978430 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvp2k\" (UniqueName: \"kubernetes.io/projected/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-kube-api-access-vvp2k\") pod \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") "
Feb 23 14:31:59.978575 master-0 kubenswrapper[7728]: I0223 14:31:59.978491 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" (UID: "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:31:59.978575 master-0 kubenswrapper[7728]: I0223 14:31:59.978505 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-images\") pod \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") "
Feb 23 14:31:59.978575 master-0 kubenswrapper[7728]: I0223 14:31:59.978550 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-cloud-controller-manager-operator-tls\") pod \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") "
Feb 23 14:31:59.978709 master-0 kubenswrapper[7728]: I0223 14:31:59.978580 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-host-etc-kube\") pod \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\" (UID: \"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04\") "
Feb 23 14:31:59.978709 master-0 kubenswrapper[7728]: I0223 14:31:59.978670 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:31:59.978791 master-0 kubenswrapper[7728]: I0223 14:31:59.978758 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:31:59.978791 master-0 kubenswrapper[7728]: I0223 14:31:59.978783 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:31:59.978884 master-0 kubenswrapper[7728]: I0223 14:31:59.978809 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x6q2\" (UniqueName: \"kubernetes.io/projected/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-kube-api-access-9x6q2\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:31:59.978884 master-0 kubenswrapper[7728]: I0223 14:31:59.978839 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:31:59.978973 master-0 kubenswrapper[7728]: I0223 14:31:59.978888 7728 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-auth-proxy-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:31:59.978973 master-0 kubenswrapper[7728]: I0223 14:31:59.978896 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" (UID: "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:31:59.979395 master-0 kubenswrapper[7728]: I0223 14:31:59.979370 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-images" (OuterVolumeSpecName: "images") pod "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" (UID: "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:31:59.981772 master-0 kubenswrapper[7728]: I0223 14:31:59.981737 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-kube-api-access-vvp2k" (OuterVolumeSpecName: "kube-api-access-vvp2k") pod "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" (UID: "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04"). InnerVolumeSpecName "kube-api-access-vvp2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:31:59.981837 master-0 kubenswrapper[7728]: I0223 14:31:59.981799 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" (UID: "c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:32:00.079531 master-0 kubenswrapper[7728]: I0223 14:32:00.079467 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.079721 master-0 kubenswrapper[7728]: I0223 14:32:00.079539 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.079721 master-0 kubenswrapper[7728]: I0223 14:32:00.079569 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6q2\" (UniqueName: \"kubernetes.io/projected/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-kube-api-access-9x6q2\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.079721 master-0 kubenswrapper[7728]: I0223 14:32:00.079692 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.079892 master-0 kubenswrapper[7728]: I0223 14:32:00.079822 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.080022 master-0 kubenswrapper[7728]: I0223 14:32:00.079994 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.080190 master-0 kubenswrapper[7728]: I0223 14:32:00.080153 7728 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:00.080249 master-0 kubenswrapper[7728]: I0223 14:32:00.080211 7728 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-host-etc-kube\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:00.080283 master-0 kubenswrapper[7728]: I0223 14:32:00.080245 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvp2k\" (UniqueName: \"kubernetes.io/projected/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-kube-api-access-vvp2k\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:00.080283 master-0 kubenswrapper[7728]: I0223 14:32:00.080272 7728 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04-images\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:00.080632 master-0 kubenswrapper[7728]: I0223 14:32:00.080604 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.080789 master-0 kubenswrapper[7728]: I0223 14:32:00.080745 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.084178 master-0 kubenswrapper[7728]: I0223 14:32:00.084130 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.095818 master-0 kubenswrapper[7728]: I0223 14:32:00.095736 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6q2\" (UniqueName: \"kubernetes.io/projected/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-kube-api-access-9x6q2\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.255078 master-0 kubenswrapper[7728]: I0223 14:32:00.255029 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:32:00.277165 master-0 kubenswrapper[7728]: W0223 14:32:00.277100 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod172d47fd_e1a1_4d77_9e31_c4f22e824d5f.slice/crio-b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5 WatchSource:0}: Error finding container b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5: Status 404 returned error can't find the container with id b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5
Feb 23 14:32:00.415727 master-0 kubenswrapper[7728]: I0223 14:32:00.415660 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5"}
Feb 23 14:32:00.419969 master-0 kubenswrapper[7728]: I0223 14:32:00.419923 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/config-sync-controllers/0.log"
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.420790 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb_c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/cluster-cloud-controller-manager/0.log"
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.420902 7728 generic.go:334] "Generic (PLEG): container finished" podID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerID="6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100" exitCode=0
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.420942 7728 generic.go:334] "Generic (PLEG): container finished" podID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerID="56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d" exitCode=0
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.420965 7728 generic.go:334] "Generic (PLEG): container finished" podID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" containerID="b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37" exitCode=0
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.421008 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerDied","Data":"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"}
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.421059 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerDied","Data":"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"}
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.421087 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerDied","Data":"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"}
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.421020 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb"
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.421119 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb" event={"ID":"c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04","Type":"ContainerDied","Data":"223c331c7c80307a3bc03e311dd668265166e3a40331282a045bd5ac8d6db865"}
Feb 23 14:32:00.421563 master-0 kubenswrapper[7728]: I0223 14:32:00.421159 7728 scope.go:117] "RemoveContainer" containerID="6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"
Feb 23 14:32:00.447307 master-0 kubenswrapper[7728]: I0223 14:32:00.444129 7728 scope.go:117] "RemoveContainer" containerID="56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"
Feb 23 14:32:00.468072 master-0 kubenswrapper[7728]: I0223 14:32:00.468034 7728 scope.go:117] "RemoveContainer" containerID="b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"
Feb 23 14:32:00.501856 master-0 kubenswrapper[7728]: I0223 14:32:00.501829 7728 scope.go:117] "RemoveContainer" containerID="06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"
Feb 23 14:32:00.524848 master-0 kubenswrapper[7728]: I0223 14:32:00.524764 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb"]
Feb 23 14:32:00.552195 master-0 kubenswrapper[7728]: I0223 14:32:00.549692 7728 scope.go:117] "RemoveContainer" containerID="89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"
Feb 23 14:32:00.581338 master-0 kubenswrapper[7728]: I0223 14:32:00.581263 7728 scope.go:117] "RemoveContainer" containerID="6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"
Feb 23 14:32:00.582043 master-0 kubenswrapper[7728]: E0223 14:32:00.582015 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100\": container with ID starting with 6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100 not found: ID does not exist" containerID="6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"
Feb 23 14:32:00.582121 master-0 kubenswrapper[7728]: I0223 14:32:00.582050 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"} err="failed to get container status \"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100\": rpc error: code = NotFound desc = could not find container \"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100\": container with ID starting with 6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100 not found: ID does not exist"
Feb 23 14:32:00.582121 master-0 kubenswrapper[7728]: I0223 14:32:00.582076 7728 scope.go:117] "RemoveContainer" containerID="56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"
Feb 23 14:32:00.582802 master-0 kubenswrapper[7728]: E0223 14:32:00.582749 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d\": container with ID starting with 56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d not found: ID does not exist" containerID="56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"
Feb 23 14:32:00.582870 master-0 kubenswrapper[7728]: I0223 14:32:00.582823 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"} err="failed to get container status \"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d\": rpc error: code = NotFound desc = could not find container \"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d\": container with ID starting with 56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d not found: ID does not exist"
Feb 23 14:32:00.582913 master-0 kubenswrapper[7728]: I0223 14:32:00.582869 7728 scope.go:117] "RemoveContainer" containerID="b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"
Feb 23 14:32:00.583672 master-0 kubenswrapper[7728]: E0223 14:32:00.583603 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37\": container with ID starting with b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37 not found: ID does not exist" containerID="b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"
Feb 23 14:32:00.583747 master-0 kubenswrapper[7728]: I0223 14:32:00.583687 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"} err="failed to get container status \"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37\": rpc error: code = NotFound desc = could not find container \"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37\": container with ID starting with b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37 not found: ID does not exist"
Feb 23 14:32:00.583787 master-0 kubenswrapper[7728]: I0223 14:32:00.583755 7728 scope.go:117] "RemoveContainer" containerID="06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"
Feb 23 14:32:00.584615 master-0 kubenswrapper[7728]: E0223 14:32:00.584569 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb\": container with ID starting with 06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb not found: ID does not exist" containerID="06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"
Feb 23 14:32:00.584615 master-0 kubenswrapper[7728]: I0223 14:32:00.584597 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"} err="failed to get container status \"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb\": rpc error: code = NotFound desc = could not find container \"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb\": container with ID starting with 06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb not found: ID does not exist"
Feb 23 14:32:00.584736 master-0 kubenswrapper[7728]: I0223 14:32:00.584616 7728 scope.go:117] "RemoveContainer" containerID="89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"
Feb 23 14:32:00.585109 master-0 kubenswrapper[7728]: E0223 14:32:00.585075 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8\": container with ID starting with 89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8 not found: ID does not exist" containerID="89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"
Feb 23 14:32:00.585109 master-0 kubenswrapper[7728]: I0223 14:32:00.585100 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"} err="failed to get container status \"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8\": rpc error: code = NotFound desc = could not find container \"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8\": container with ID starting with 89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8 not found: ID does not exist"
Feb 23 14:32:00.585219 master-0 kubenswrapper[7728]: I0223 14:32:00.585119 7728 scope.go:117] "RemoveContainer" containerID="6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"
Feb 23 14:32:00.585821 master-0 kubenswrapper[7728]: I0223 14:32:00.585776 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"} err="failed to get container status \"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100\": rpc error: code = NotFound desc = could not find container \"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100\": container with ID starting with 6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100 not found: ID does not exist"
Feb 23 14:32:00.585899 master-0 kubenswrapper[7728]: I0223 14:32:00.585822 7728 scope.go:117] "RemoveContainer" containerID="56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"
Feb 23 14:32:00.586349 master-0 kubenswrapper[7728]: I0223 14:32:00.586317 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"} err="failed to get container status \"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d\": rpc error: code = NotFound desc = could not find container \"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d\": container
with ID starting with 56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d not found: ID does not exist" Feb 23 14:32:00.586431 master-0 kubenswrapper[7728]: I0223 14:32:00.586348 7728 scope.go:117] "RemoveContainer" containerID="b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37" Feb 23 14:32:00.586688 master-0 kubenswrapper[7728]: I0223 14:32:00.586662 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"} err="failed to get container status \"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37\": rpc error: code = NotFound desc = could not find container \"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37\": container with ID starting with b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37 not found: ID does not exist" Feb 23 14:32:00.586688 master-0 kubenswrapper[7728]: I0223 14:32:00.586687 7728 scope.go:117] "RemoveContainer" containerID="06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb" Feb 23 14:32:00.587012 master-0 kubenswrapper[7728]: I0223 14:32:00.586965 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"} err="failed to get container status \"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb\": rpc error: code = NotFound desc = could not find container \"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb\": container with ID starting with 06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb not found: ID does not exist" Feb 23 14:32:00.587080 master-0 kubenswrapper[7728]: I0223 14:32:00.587015 7728 scope.go:117] "RemoveContainer" containerID="89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8" Feb 23 14:32:00.587340 master-0 kubenswrapper[7728]: I0223 14:32:00.587310 7728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"} err="failed to get container status \"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8\": rpc error: code = NotFound desc = could not find container \"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8\": container with ID starting with 89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8 not found: ID does not exist" Feb 23 14:32:00.587340 master-0 kubenswrapper[7728]: I0223 14:32:00.587339 7728 scope.go:117] "RemoveContainer" containerID="6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100" Feb 23 14:32:00.587910 master-0 kubenswrapper[7728]: I0223 14:32:00.587866 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100"} err="failed to get container status \"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100\": rpc error: code = NotFound desc = could not find container \"6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100\": container with ID starting with 6025024f5636acf7c27c5b7a39d4bb79ae61ce702639b0f3582d218d6929f100 not found: ID does not exist" Feb 23 14:32:00.587910 master-0 kubenswrapper[7728]: I0223 14:32:00.587901 7728 scope.go:117] "RemoveContainer" containerID="56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d" Feb 23 14:32:00.588458 master-0 kubenswrapper[7728]: I0223 14:32:00.588418 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d"} err="failed to get container status \"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d\": rpc error: code = NotFound desc = could not find container \"56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d\": container 
with ID starting with 56b4c213d9a7300864baea1d80b35071587d7cb159b21a8c971fe22bd13e200d not found: ID does not exist" Feb 23 14:32:00.588458 master-0 kubenswrapper[7728]: I0223 14:32:00.588444 7728 scope.go:117] "RemoveContainer" containerID="b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37" Feb 23 14:32:00.589090 master-0 kubenswrapper[7728]: I0223 14:32:00.589025 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37"} err="failed to get container status \"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37\": rpc error: code = NotFound desc = could not find container \"b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37\": container with ID starting with b9696feb917a539850d6758463096fc5875c43e8388f7fe164eb4e11ef28ad37 not found: ID does not exist" Feb 23 14:32:00.589090 master-0 kubenswrapper[7728]: I0223 14:32:00.589076 7728 scope.go:117] "RemoveContainer" containerID="06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb" Feb 23 14:32:00.589441 master-0 kubenswrapper[7728]: I0223 14:32:00.589413 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb"} err="failed to get container status \"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb\": rpc error: code = NotFound desc = could not find container \"06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb\": container with ID starting with 06ed5eab4f45a414dec39fdf73e09eda9befba12eaf73ac8d264e79dbcbe1fcb not found: ID does not exist" Feb 23 14:32:00.589441 master-0 kubenswrapper[7728]: I0223 14:32:00.589437 7728 scope.go:117] "RemoveContainer" containerID="89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8" Feb 23 14:32:00.589812 master-0 kubenswrapper[7728]: I0223 14:32:00.589785 7728 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8"} err="failed to get container status \"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8\": rpc error: code = NotFound desc = could not find container \"89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8\": container with ID starting with 89c65e32357fb90a65db3743a53bf98698ca1c5da74b91fe797e842ada8b4fd8 not found: ID does not exist" Feb 23 14:32:00.605897 master-0 kubenswrapper[7728]: I0223 14:32:00.605848 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-cbd75ff8d-9dllb"] Feb 23 14:32:01.241989 master-0 kubenswrapper[7728]: I0223 14:32:01.241345 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04" path="/var/lib/kubelet/pods/c58fee1f-8bd2-46b1-9bf9-a8a3dd55aa04/volumes" Feb 23 14:32:01.348926 master-0 kubenswrapper[7728]: I0223 14:32:01.348835 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57"] Feb 23 14:32:01.349749 master-0 kubenswrapper[7728]: I0223 14:32:01.349720 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.351331 master-0 kubenswrapper[7728]: I0223 14:32:01.351295 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 14:32:01.351839 master-0 kubenswrapper[7728]: I0223 14:32:01.351492 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-j9tjr" Feb 23 14:32:01.360392 master-0 kubenswrapper[7728]: I0223 14:32:01.360348 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57"] Feb 23 14:32:01.398231 master-0 kubenswrapper[7728]: I0223 14:32:01.397624 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.398231 master-0 kubenswrapper[7728]: I0223 14:32:01.397693 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.398231 master-0 kubenswrapper[7728]: I0223 14:32:01.397774 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phzkn\" (UniqueName: \"kubernetes.io/projected/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-kube-api-access-phzkn\") pod 
\"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.431056 master-0 kubenswrapper[7728]: I0223 14:32:01.431007 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"c9d52dd8df6e377a4d3cb203e2bf79a0b610e87fcaa198b6bee208b63f11b129"} Feb 23 14:32:01.431143 master-0 kubenswrapper[7728]: I0223 14:32:01.431049 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"5735cff00eeb26df95f36e1548b653da767e9f1764f098d8c148f2a98be789ec"} Feb 23 14:32:01.498596 master-0 kubenswrapper[7728]: I0223 14:32:01.498537 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.498596 master-0 kubenswrapper[7728]: I0223 14:32:01.498601 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.499184 master-0 kubenswrapper[7728]: I0223 14:32:01.499140 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-phzkn\" (UniqueName: \"kubernetes.io/projected/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-kube-api-access-phzkn\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.501415 master-0 kubenswrapper[7728]: I0223 14:32:01.501350 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.510233 master-0 kubenswrapper[7728]: I0223 14:32:01.510182 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.522602 master-0 kubenswrapper[7728]: I0223 14:32:01.522557 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzkn\" (UniqueName: \"kubernetes.io/projected/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-kube-api-access-phzkn\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:01.710198 master-0 kubenswrapper[7728]: I0223 14:32:01.710072 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:32:02.068236 master-0 kubenswrapper[7728]: I0223 14:32:02.067935 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57"] Feb 23 14:32:02.072586 master-0 kubenswrapper[7728]: W0223 14:32:02.072333 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cc28e06_3542_4a25_a8b1_5f5b4ee41114.slice/crio-e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691 WatchSource:0}: Error finding container e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691: Status 404 returned error can't find the container with id e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691 Feb 23 14:32:02.441153 master-0 kubenswrapper[7728]: I0223 14:32:02.441074 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" event={"ID":"5cc28e06-3542-4a25-a8b1-5f5b4ee41114","Type":"ContainerStarted","Data":"dab7706dbc7b87f5dfe86583c49c1b5888ad124b3a580895f9a21bab5c02cb05"} Feb 23 14:32:02.441709 master-0 kubenswrapper[7728]: I0223 14:32:02.441172 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" event={"ID":"5cc28e06-3542-4a25-a8b1-5f5b4ee41114","Type":"ContainerStarted","Data":"e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691"} Feb 23 14:32:02.444215 master-0 kubenswrapper[7728]: I0223 14:32:02.444161 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"fdc652e9656c00d3c073d43834d94e829b13a496c3f4293067e250e5e619251d"} Feb 
23 14:32:03.247255 master-0 kubenswrapper[7728]: I0223 14:32:03.246413 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" podStartSLOduration=4.24638285 podStartE2EDuration="4.24638285s" podCreationTimestamp="2026-02-23 14:31:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:03.245675011 +0000 UTC m=+816.208336347" watchObservedRunningTime="2026-02-23 14:32:03.24638285 +0000 UTC m=+816.209044186" Feb 23 14:32:03.314656 master-0 kubenswrapper[7728]: I0223 14:32:03.314590 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm"] Feb 23 14:32:03.315686 master-0 kubenswrapper[7728]: I0223 14:32:03.315653 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.317983 master-0 kubenswrapper[7728]: I0223 14:32:03.317851 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 14:32:03.318140 master-0 kubenswrapper[7728]: I0223 14:32:03.318104 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-jqqg2" Feb 23 14:32:03.330020 master-0 kubenswrapper[7728]: I0223 14:32:03.329972 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7kb5\" (UniqueName: \"kubernetes.io/projected/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-kube-api-access-t7kb5\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.330122 master-0 
kubenswrapper[7728]: I0223 14:32:03.330034 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-secret-volume\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.330122 master-0 kubenswrapper[7728]: I0223 14:32:03.330064 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-config-volume\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.430563 master-0 kubenswrapper[7728]: I0223 14:32:03.371841 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm"] Feb 23 14:32:03.432263 master-0 kubenswrapper[7728]: I0223 14:32:03.432179 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7kb5\" (UniqueName: \"kubernetes.io/projected/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-kube-api-access-t7kb5\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.432387 master-0 kubenswrapper[7728]: I0223 14:32:03.432276 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-secret-volume\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.432387 master-0 
kubenswrapper[7728]: I0223 14:32:03.432306 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-config-volume\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.433540 master-0 kubenswrapper[7728]: I0223 14:32:03.433507 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-config-volume\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.437180 master-0 kubenswrapper[7728]: I0223 14:32:03.437139 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-secret-volume\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.451251 master-0 kubenswrapper[7728]: I0223 14:32:03.451206 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7kb5\" (UniqueName: \"kubernetes.io/projected/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-kube-api-access-t7kb5\") pod \"collect-profiles-29530950-wnkgm\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:03.452923 master-0 kubenswrapper[7728]: I0223 14:32:03.452866 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" 
event={"ID":"5cc28e06-3542-4a25-a8b1-5f5b4ee41114","Type":"ContainerStarted","Data":"f6ede5e39b7b33efaf1190fcbfc978e3b3ab212001c387b4a6ad421efbb824cf"} Feb 23 14:32:03.484795 master-0 kubenswrapper[7728]: I0223 14:32:03.484705 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" podStartSLOduration=2.484677533 podStartE2EDuration="2.484677533s" podCreationTimestamp="2026-02-23 14:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:03.478843731 +0000 UTC m=+816.441505057" watchObservedRunningTime="2026-02-23 14:32:03.484677533 +0000 UTC m=+816.447338869" Feb 23 14:32:03.637932 master-0 kubenswrapper[7728]: I0223 14:32:03.637772 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:04.070012 master-0 kubenswrapper[7728]: I0223 14:32:04.069915 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm"] Feb 23 14:32:04.079546 master-0 kubenswrapper[7728]: W0223 14:32:04.079306 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78f5dea4_ed09_44a1_8eb1_d1fc497cc173.slice/crio-692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b WatchSource:0}: Error finding container 692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b: Status 404 returned error can't find the container with id 692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b Feb 23 14:32:04.461280 master-0 kubenswrapper[7728]: I0223 14:32:04.461150 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" 
event={"ID":"78f5dea4-ed09-44a1-8eb1-d1fc497cc173","Type":"ContainerStarted","Data":"27840ca7db3cacb7b24041918e945eaa29f553e36d936e622a640f67b21753c5"} Feb 23 14:32:04.461835 master-0 kubenswrapper[7728]: I0223 14:32:04.461336 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" event={"ID":"78f5dea4-ed09-44a1-8eb1-d1fc497cc173","Type":"ContainerStarted","Data":"692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b"} Feb 23 14:32:04.534947 master-0 kubenswrapper[7728]: I0223 14:32:04.534870 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp"] Feb 23 14:32:04.536057 master-0 kubenswrapper[7728]: I0223 14:32:04.536021 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:32:04.537714 master-0 kubenswrapper[7728]: I0223 14:32:04.537668 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Feb 23 14:32:04.538074 master-0 kubenswrapper[7728]: I0223 14:32:04.538021 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-58fb6744f5-848dv"] Feb 23 14:32:04.539001 master-0 kubenswrapper[7728]: I0223 14:32:04.538928 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" Feb 23 14:32:04.540995 master-0 kubenswrapper[7728]: I0223 14:32:04.540935 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b65dc9fcb-w68qb"] Feb 23 14:32:04.541862 master-0 kubenswrapper[7728]: I0223 14:32:04.541825 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.544355 master-0 kubenswrapper[7728]: I0223 14:32:04.544316 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 23 14:32:04.544459 master-0 kubenswrapper[7728]: I0223 14:32:04.544360 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 23 14:32:04.545093 master-0 kubenswrapper[7728]: I0223 14:32:04.545060 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c02c8912-46c9-4f86-ad28-9bfb2eca4e54-tls-certificates\") pod \"prometheus-operator-admission-webhook-75d56db95f-rg8tp\" (UID: \"c02c8912-46c9-4f86-ad28-9bfb2eca4e54\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:32:04.545763 master-0 kubenswrapper[7728]: I0223 14:32:04.545706 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qfr7\" (UniqueName: \"kubernetes.io/projected/e5104cdd-85b8-49ba-95ca-3e9c8218a01e-kube-api-access-8qfr7\") pod \"network-check-source-58fb6744f5-848dv\" (UID: \"e5104cdd-85b8-49ba-95ca-3e9c8218a01e\") " pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" Feb 23 14:32:04.546952 master-0 kubenswrapper[7728]: I0223 14:32:04.546919 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 23 14:32:04.547133 master-0 kubenswrapper[7728]: I0223 14:32:04.547102 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 14:32:04.547357 master-0 kubenswrapper[7728]: I0223 14:32:04.547327 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 23 14:32:04.548740 
master-0 kubenswrapper[7728]: I0223 14:32:04.548707 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 14:32:04.646798 master-0 kubenswrapper[7728]: I0223 14:32:04.646702 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-metrics-certs\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.646798 master-0 kubenswrapper[7728]: I0223 14:32:04.646809 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qfr7\" (UniqueName: \"kubernetes.io/projected/e5104cdd-85b8-49ba-95ca-3e9c8218a01e-kube-api-access-8qfr7\") pod \"network-check-source-58fb6744f5-848dv\" (UID: \"e5104cdd-85b8-49ba-95ca-3e9c8218a01e\") " pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" Feb 23 14:32:04.647094 master-0 kubenswrapper[7728]: I0223 14:32:04.646851 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p6hn\" (UniqueName: \"kubernetes.io/projected/06bde94a-3126-4d0f-baba-49dc5fbec61b-kube-api-access-2p6hn\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.647094 master-0 kubenswrapper[7728]: I0223 14:32:04.646883 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-default-certificate\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.647094 master-0 kubenswrapper[7728]: I0223 
14:32:04.646902 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-stats-auth\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.647094 master-0 kubenswrapper[7728]: I0223 14:32:04.646921 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bde94a-3126-4d0f-baba-49dc5fbec61b-service-ca-bundle\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.647094 master-0 kubenswrapper[7728]: I0223 14:32:04.646949 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c02c8912-46c9-4f86-ad28-9bfb2eca4e54-tls-certificates\") pod \"prometheus-operator-admission-webhook-75d56db95f-rg8tp\" (UID: \"c02c8912-46c9-4f86-ad28-9bfb2eca4e54\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:32:04.650752 master-0 kubenswrapper[7728]: I0223 14:32:04.650718 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c02c8912-46c9-4f86-ad28-9bfb2eca4e54-tls-certificates\") pod \"prometheus-operator-admission-webhook-75d56db95f-rg8tp\" (UID: \"c02c8912-46c9-4f86-ad28-9bfb2eca4e54\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:32:04.703422 master-0 kubenswrapper[7728]: I0223 14:32:04.703367 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-58fb6744f5-848dv"] Feb 23 14:32:04.705869 master-0 kubenswrapper[7728]: I0223 
14:32:04.705650 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp"] Feb 23 14:32:04.707068 master-0 kubenswrapper[7728]: I0223 14:32:04.707009 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" podStartSLOduration=124.70699008 podStartE2EDuration="2m4.70699008s" podCreationTimestamp="2026-02-23 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:04.702010971 +0000 UTC m=+817.664672267" watchObservedRunningTime="2026-02-23 14:32:04.70699008 +0000 UTC m=+817.669651367" Feb 23 14:32:04.735190 master-0 kubenswrapper[7728]: I0223 14:32:04.733725 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qfr7\" (UniqueName: \"kubernetes.io/projected/e5104cdd-85b8-49ba-95ca-3e9c8218a01e-kube-api-access-8qfr7\") pod \"network-check-source-58fb6744f5-848dv\" (UID: \"e5104cdd-85b8-49ba-95ca-3e9c8218a01e\") " pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" Feb 23 14:32:04.747790 master-0 kubenswrapper[7728]: I0223 14:32:04.747738 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p6hn\" (UniqueName: \"kubernetes.io/projected/06bde94a-3126-4d0f-baba-49dc5fbec61b-kube-api-access-2p6hn\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.747988 master-0 kubenswrapper[7728]: I0223 14:32:04.747811 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-default-certificate\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: 
\"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.747988 master-0 kubenswrapper[7728]: I0223 14:32:04.747836 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-stats-auth\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.747988 master-0 kubenswrapper[7728]: I0223 14:32:04.747860 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bde94a-3126-4d0f-baba-49dc5fbec61b-service-ca-bundle\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.747988 master-0 kubenswrapper[7728]: I0223 14:32:04.747935 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-metrics-certs\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.750267 master-0 kubenswrapper[7728]: I0223 14:32:04.750228 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bde94a-3126-4d0f-baba-49dc5fbec61b-service-ca-bundle\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.751466 master-0 kubenswrapper[7728]: I0223 14:32:04.751439 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-metrics-certs\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.753196 master-0 kubenswrapper[7728]: I0223 14:32:04.753175 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-stats-auth\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.763365 master-0 kubenswrapper[7728]: I0223 14:32:04.763339 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-default-certificate\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.810770 master-0 kubenswrapper[7728]: I0223 14:32:04.810730 7728 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 14:32:04.850572 master-0 kubenswrapper[7728]: I0223 14:32:04.850202 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p6hn\" (UniqueName: \"kubernetes.io/projected/06bde94a-3126-4d0f-baba-49dc5fbec61b-kube-api-access-2p6hn\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.868507 master-0 kubenswrapper[7728]: I0223 14:32:04.866342 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:32:04.885381 master-0 kubenswrapper[7728]: I0223 14:32:04.885306 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" Feb 23 14:32:04.905336 master-0 kubenswrapper[7728]: I0223 14:32:04.904048 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:04.944766 master-0 kubenswrapper[7728]: I0223 14:32:04.944320 7728 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 14:32:05.385624 master-0 kubenswrapper[7728]: I0223 14:32:05.384931 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-58fb6744f5-848dv"] Feb 23 14:32:05.389846 master-0 kubenswrapper[7728]: I0223 14:32:05.389813 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp"] Feb 23 14:32:05.392989 master-0 kubenswrapper[7728]: W0223 14:32:05.392272 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5104cdd_85b8_49ba_95ca_3e9c8218a01e.slice/crio-82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9 WatchSource:0}: Error finding container 82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9: Status 404 returned error can't find the container with id 82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9 Feb 23 14:32:05.470639 master-0 kubenswrapper[7728]: I0223 14:32:05.470594 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" 
event={"ID":"e5104cdd-85b8-49ba-95ca-3e9c8218a01e","Type":"ContainerStarted","Data":"82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9"} Feb 23 14:32:05.471981 master-0 kubenswrapper[7728]: I0223 14:32:05.471937 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" event={"ID":"06bde94a-3126-4d0f-baba-49dc5fbec61b","Type":"ContainerStarted","Data":"e75c63f7eeabff918836bbbeb9c11ed440bed473107c3e0b28076ddfdf91aadb"} Feb 23 14:32:05.473316 master-0 kubenswrapper[7728]: I0223 14:32:05.473262 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" event={"ID":"c02c8912-46c9-4f86-ad28-9bfb2eca4e54","Type":"ContainerStarted","Data":"b6a377cd77390650f0232d49deb3c30f98f3070caae925837c6c5aeeec6246e5"} Feb 23 14:32:05.475225 master-0 kubenswrapper[7728]: I0223 14:32:05.475172 7728 generic.go:334] "Generic (PLEG): container finished" podID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerID="27840ca7db3cacb7b24041918e945eaa29f553e36d936e622a640f67b21753c5" exitCode=0 Feb 23 14:32:05.475297 master-0 kubenswrapper[7728]: I0223 14:32:05.475225 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" event={"ID":"78f5dea4-ed09-44a1-8eb1-d1fc497cc173","Type":"ContainerDied","Data":"27840ca7db3cacb7b24041918e945eaa29f553e36d936e622a640f67b21753c5"} Feb 23 14:32:06.485016 master-0 kubenswrapper[7728]: I0223 14:32:06.484959 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" event={"ID":"e5104cdd-85b8-49ba-95ca-3e9c8218a01e","Type":"ContainerStarted","Data":"4a7b48481b75f40fe6158bb21dd43fb7c910b44534fca5550f51e80a8e690723"} Feb 23 14:32:06.555370 master-0 kubenswrapper[7728]: I0223 14:32:06.555284 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" podStartSLOduration=868.555261828 podStartE2EDuration="14m28.555261828s" podCreationTimestamp="2026-02-23 14:17:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:06.552170547 +0000 UTC m=+819.514831843" watchObservedRunningTime="2026-02-23 14:32:06.555261828 +0000 UTC m=+819.517923134" Feb 23 14:32:07.110877 master-0 kubenswrapper[7728]: I0223 14:32:07.110816 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:07.180902 master-0 kubenswrapper[7728]: I0223 14:32:07.180850 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-secret-volume\") pod \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " Feb 23 14:32:07.180902 master-0 kubenswrapper[7728]: I0223 14:32:07.180907 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-config-volume\") pod \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " Feb 23 14:32:07.181185 master-0 kubenswrapper[7728]: I0223 14:32:07.180948 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7kb5\" (UniqueName: \"kubernetes.io/projected/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-kube-api-access-t7kb5\") pod \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\" (UID: \"78f5dea4-ed09-44a1-8eb1-d1fc497cc173\") " Feb 23 14:32:07.181703 master-0 kubenswrapper[7728]: I0223 14:32:07.181638 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-config-volume" (OuterVolumeSpecName: "config-volume") pod "78f5dea4-ed09-44a1-8eb1-d1fc497cc173" (UID: "78f5dea4-ed09-44a1-8eb1-d1fc497cc173"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:32:07.184197 master-0 kubenswrapper[7728]: I0223 14:32:07.184144 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "78f5dea4-ed09-44a1-8eb1-d1fc497cc173" (UID: "78f5dea4-ed09-44a1-8eb1-d1fc497cc173"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:32:07.184294 master-0 kubenswrapper[7728]: I0223 14:32:07.184212 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-kube-api-access-t7kb5" (OuterVolumeSpecName: "kube-api-access-t7kb5") pod "78f5dea4-ed09-44a1-8eb1-d1fc497cc173" (UID: "78f5dea4-ed09-44a1-8eb1-d1fc497cc173"). InnerVolumeSpecName "kube-api-access-t7kb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:32:07.252458 master-0 kubenswrapper[7728]: I0223 14:32:07.252406 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qwsmk"] Feb 23 14:32:07.252819 master-0 kubenswrapper[7728]: E0223 14:32:07.252789 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerName="collect-profiles" Feb 23 14:32:07.252954 master-0 kubenswrapper[7728]: I0223 14:32:07.252809 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerName="collect-profiles" Feb 23 14:32:07.253061 master-0 kubenswrapper[7728]: I0223 14:32:07.253019 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerName="collect-profiles" Feb 23 14:32:07.253402 master-0 kubenswrapper[7728]: I0223 14:32:07.253379 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.255031 master-0 kubenswrapper[7728]: I0223 14:32:07.254994 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-8k4nq" Feb 23 14:32:07.255932 master-0 kubenswrapper[7728]: I0223 14:32:07.255262 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 14:32:07.255932 master-0 kubenswrapper[7728]: I0223 14:32:07.255472 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 14:32:07.282897 master-0 kubenswrapper[7728]: I0223 14:32:07.282832 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.282897 master-0 kubenswrapper[7728]: I0223 14:32:07.282904 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sflb\" (UniqueName: \"kubernetes.io/projected/c0a39496-5e47-4415-b8bf-ed0634797ce1-kube-api-access-9sflb\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.283464 master-0 kubenswrapper[7728]: I0223 14:32:07.283385 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " 
pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.283591 master-0 kubenswrapper[7728]: I0223 14:32:07.283550 7728 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-secret-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 14:32:07.283591 master-0 kubenswrapper[7728]: I0223 14:32:07.283584 7728 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-config-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 14:32:07.283713 master-0 kubenswrapper[7728]: I0223 14:32:07.283596 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7kb5\" (UniqueName: \"kubernetes.io/projected/78f5dea4-ed09-44a1-8eb1-d1fc497cc173-kube-api-access-t7kb5\") on node \"master-0\" DevicePath \"\"" Feb 23 14:32:07.385517 master-0 kubenswrapper[7728]: I0223 14:32:07.385358 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.385517 master-0 kubenswrapper[7728]: I0223 14:32:07.385430 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.385517 master-0 kubenswrapper[7728]: I0223 14:32:07.385463 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sflb\" (UniqueName: 
\"kubernetes.io/projected/c0a39496-5e47-4415-b8bf-ed0634797ce1-kube-api-access-9sflb\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.392688 master-0 kubenswrapper[7728]: I0223 14:32:07.392394 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.404168 master-0 kubenswrapper[7728]: I0223 14:32:07.400353 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.424375 master-0 kubenswrapper[7728]: I0223 14:32:07.424319 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sflb\" (UniqueName: \"kubernetes.io/projected/c0a39496-5e47-4415-b8bf-ed0634797ce1-kube-api-access-9sflb\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:07.491827 master-0 kubenswrapper[7728]: I0223 14:32:07.490995 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" event={"ID":"78f5dea4-ed09-44a1-8eb1-d1fc497cc173","Type":"ContainerDied","Data":"692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b"} Feb 23 14:32:07.491827 master-0 kubenswrapper[7728]: I0223 14:32:07.491076 7728 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b" Feb 23 14:32:07.491827 master-0 kubenswrapper[7728]: I0223 14:32:07.491016 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:32:07.570455 master-0 kubenswrapper[7728]: I0223 14:32:07.570415 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:32:10.507197 master-0 kubenswrapper[7728]: I0223 14:32:10.507114 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" event={"ID":"c02c8912-46c9-4f86-ad28-9bfb2eca4e54","Type":"ContainerStarted","Data":"2e584c3fc9cf516104594bf0a348d5b59d91e3945a5a45d94da10c2d426282bd"} Feb 23 14:32:10.507197 master-0 kubenswrapper[7728]: I0223 14:32:10.507187 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:32:10.509757 master-0 kubenswrapper[7728]: I0223 14:32:10.509709 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" event={"ID":"06bde94a-3126-4d0f-baba-49dc5fbec61b","Type":"ContainerStarted","Data":"7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06"} Feb 23 14:32:10.511454 master-0 kubenswrapper[7728]: I0223 14:32:10.511414 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:32:10.515733 master-0 kubenswrapper[7728]: I0223 14:32:10.515619 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qwsmk" 
event={"ID":"c0a39496-5e47-4415-b8bf-ed0634797ce1","Type":"ContainerStarted","Data":"82ebb613a27a91687ba715ffa70d565def5a49f407b70b9bf2cfb026a5c45a01"} Feb 23 14:32:10.515733 master-0 kubenswrapper[7728]: I0223 14:32:10.515666 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qwsmk" event={"ID":"c0a39496-5e47-4415-b8bf-ed0634797ce1","Type":"ContainerStarted","Data":"3debae3963d7ed5cbda602b7ac89dc5c4d861ae9dad9b89fb0b3fcce27f1aad1"} Feb 23 14:32:10.537403 master-0 kubenswrapper[7728]: I0223 14:32:10.537315 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" podStartSLOduration=773.691446557 podStartE2EDuration="12m58.537290211s" podCreationTimestamp="2026-02-23 14:19:12 +0000 UTC" firstStartedPulling="2026-02-23 14:32:05.400427343 +0000 UTC m=+818.363088639" lastFinishedPulling="2026-02-23 14:32:10.246270987 +0000 UTC m=+823.208932293" observedRunningTime="2026-02-23 14:32:10.534655783 +0000 UTC m=+823.497317089" watchObservedRunningTime="2026-02-23 14:32:10.537290211 +0000 UTC m=+823.499951507" Feb 23 14:32:10.557197 master-0 kubenswrapper[7728]: I0223 14:32:10.557123 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podStartSLOduration=773.257365074 podStartE2EDuration="12m58.557108216s" podCreationTimestamp="2026-02-23 14:19:12 +0000 UTC" firstStartedPulling="2026-02-23 14:32:04.944268487 +0000 UTC m=+817.906929783" lastFinishedPulling="2026-02-23 14:32:10.244011589 +0000 UTC m=+823.206672925" observedRunningTime="2026-02-23 14:32:10.552803114 +0000 UTC m=+823.515464410" watchObservedRunningTime="2026-02-23 14:32:10.557108216 +0000 UTC m=+823.519769512" Feb 23 14:32:10.591858 master-0 kubenswrapper[7728]: I0223 14:32:10.589777 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-server-qwsmk" podStartSLOduration=3.5897511250000003 podStartE2EDuration="3.589751125s" podCreationTimestamp="2026-02-23 14:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:10.587421484 +0000 UTC m=+823.550082800" watchObservedRunningTime="2026-02-23 14:32:10.589751125 +0000 UTC m=+823.552412431" Feb 23 14:32:10.693771 master-0 kubenswrapper[7728]: I0223 14:32:10.693619 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-754bc4d665-nl92v"] Feb 23 14:32:10.694911 master-0 kubenswrapper[7728]: I0223 14:32:10.694874 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.698801 master-0 kubenswrapper[7728]: I0223 14:32:10.698763 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Feb 23 14:32:10.698902 master-0 kubenswrapper[7728]: I0223 14:32:10.698823 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Feb 23 14:32:10.699003 master-0 kubenswrapper[7728]: I0223 14:32:10.698978 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Feb 23 14:32:10.699567 master-0 kubenswrapper[7728]: I0223 14:32:10.699468 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-5q5j4" Feb 23 14:32:10.712585 master-0 kubenswrapper[7728]: I0223 14:32:10.712553 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-754bc4d665-nl92v"] Feb 23 14:32:10.728038 master-0 kubenswrapper[7728]: I0223 14:32:10.727995 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.728221 master-0 kubenswrapper[7728]: I0223 14:32:10.728049 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.728221 master-0 kubenswrapper[7728]: I0223 14:32:10.728073 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.728221 master-0 kubenswrapper[7728]: I0223 14:32:10.728090 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w5kr\" (UniqueName: \"kubernetes.io/projected/18da400b-2271-455d-be0d-0ed44c74f78d-kube-api-access-2w5kr\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.829134 master-0 kubenswrapper[7728]: I0223 14:32:10.829067 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.829403 master-0 kubenswrapper[7728]: I0223 14:32:10.829344 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.829625 master-0 kubenswrapper[7728]: E0223 14:32:10.829583 7728 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Feb 23 14:32:10.829680 master-0 kubenswrapper[7728]: I0223 14:32:10.829614 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.829719 master-0 kubenswrapper[7728]: E0223 14:32:10.829676 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls podName:18da400b-2271-455d-be0d-0ed44c74f78d nodeName:}" failed. No retries permitted until 2026-02-23 14:32:11.32964821 +0000 UTC m=+824.292309536 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls") pod "prometheus-operator-754bc4d665-nl92v" (UID: "18da400b-2271-455d-be0d-0ed44c74f78d") : secret "prometheus-operator-tls" not found Feb 23 14:32:10.829758 master-0 kubenswrapper[7728]: I0223 14:32:10.829707 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w5kr\" (UniqueName: \"kubernetes.io/projected/18da400b-2271-455d-be0d-0ed44c74f78d-kube-api-access-2w5kr\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.830582 master-0 kubenswrapper[7728]: I0223 14:32:10.830552 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.832970 master-0 kubenswrapper[7728]: I0223 14:32:10.832929 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.851001 master-0 kubenswrapper[7728]: I0223 14:32:10.850945 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w5kr\" (UniqueName: \"kubernetes.io/projected/18da400b-2271-455d-be0d-0ed44c74f78d-kube-api-access-2w5kr\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: 
\"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:10.905822 master-0 kubenswrapper[7728]: I0223 14:32:10.905759 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:10.907903 master-0 kubenswrapper[7728]: I0223 14:32:10.907841 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:10.907903 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:10.907903 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:10.907903 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:10.908044 master-0 kubenswrapper[7728]: I0223 14:32:10.907926 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:11.337834 master-0 kubenswrapper[7728]: I0223 14:32:11.337779 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:11.341168 master-0 kubenswrapper[7728]: I0223 14:32:11.341126 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: 
\"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:11.614876 master-0 kubenswrapper[7728]: I0223 14:32:11.614767 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:32:11.907803 master-0 kubenswrapper[7728]: I0223 14:32:11.907649 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:11.907803 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:11.907803 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:11.907803 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:11.907803 master-0 kubenswrapper[7728]: I0223 14:32:11.907721 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:12.090967 master-0 kubenswrapper[7728]: I0223 14:32:12.090881 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-754bc4d665-nl92v"] Feb 23 14:32:12.099689 master-0 kubenswrapper[7728]: W0223 14:32:12.099612 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18da400b_2271_455d_be0d_0ed44c74f78d.slice/crio-b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa WatchSource:0}: Error finding container b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa: Status 404 returned error can't find the container with id b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa Feb 23 14:32:12.534053 master-0 
kubenswrapper[7728]: I0223 14:32:12.533933 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" event={"ID":"18da400b-2271-455d-be0d-0ed44c74f78d","Type":"ContainerStarted","Data":"b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa"} Feb 23 14:32:12.907934 master-0 kubenswrapper[7728]: I0223 14:32:12.907795 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:12.907934 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:12.907934 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:12.907934 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:12.907934 master-0 kubenswrapper[7728]: I0223 14:32:12.907913 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:13.906927 master-0 kubenswrapper[7728]: I0223 14:32:13.906837 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:13.906927 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:13.906927 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:13.906927 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:13.907284 master-0 kubenswrapper[7728]: I0223 14:32:13.906925 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" 
podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:14.904693 master-0 kubenswrapper[7728]: I0223 14:32:14.904649 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:32:14.906930 master-0 kubenswrapper[7728]: I0223 14:32:14.906884 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:14.906930 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:14.906930 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:14.906930 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:14.907112 master-0 kubenswrapper[7728]: I0223 14:32:14.906940 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:15.228017 master-0 kubenswrapper[7728]: I0223 14:32:15.227950 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Feb 23 14:32:15.229171 master-0 kubenswrapper[7728]: I0223 14:32:15.229139 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.235509 master-0 kubenswrapper[7728]: I0223 14:32:15.232131 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-4hjzs" Feb 23 14:32:15.240402 master-0 kubenswrapper[7728]: I0223 14:32:15.236862 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Feb 23 14:32:15.240402 master-0 kubenswrapper[7728]: I0223 14:32:15.237638 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 23 14:32:15.289251 master-0 kubenswrapper[7728]: I0223 14:32:15.289187 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25b855e3-80dc-4ee5-80ab-c4742578a92f-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.289499 master-0 kubenswrapper[7728]: I0223 14:32:15.289310 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.289499 master-0 kubenswrapper[7728]: I0223 14:32:15.289358 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " 
pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.392187 master-0 kubenswrapper[7728]: I0223 14:32:15.392127 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.392187 master-0 kubenswrapper[7728]: I0223 14:32:15.392194 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.392460 master-0 kubenswrapper[7728]: I0223 14:32:15.392341 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.392460 master-0 kubenswrapper[7728]: I0223 14:32:15.392419 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25b855e3-80dc-4ee5-80ab-c4742578a92f-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.392619 master-0 kubenswrapper[7728]: I0223 14:32:15.392591 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-kubelet-dir\") pod 
\"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.429280 master-0 kubenswrapper[7728]: I0223 14:32:15.426964 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25b855e3-80dc-4ee5-80ab-c4742578a92f-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.564180 master-0 kubenswrapper[7728]: I0223 14:32:15.564124 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" event={"ID":"18da400b-2271-455d-be0d-0ed44c74f78d","Type":"ContainerStarted","Data":"fa5fb647e8638eaad65b8811059d4ee7fcda78f5ba1a5eaad0814e3ad8d66e8c"} Feb 23 14:32:15.564368 master-0 kubenswrapper[7728]: I0223 14:32:15.564192 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" event={"ID":"18da400b-2271-455d-be0d-0ed44c74f78d","Type":"ContainerStarted","Data":"c81b3d29f769948ceb44d95e3377e7a7b6aa0ce8d14bbde1a834e3be95ef060d"} Feb 23 14:32:15.586001 master-0 kubenswrapper[7728]: I0223 14:32:15.585949 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:32:15.592017 master-0 kubenswrapper[7728]: I0223 14:32:15.591338 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" podStartSLOduration=2.8047459310000002 podStartE2EDuration="5.591321475s" podCreationTimestamp="2026-02-23 14:32:10 +0000 UTC" firstStartedPulling="2026-02-23 14:32:12.101661209 +0000 UTC m=+825.064322525" lastFinishedPulling="2026-02-23 14:32:14.888236773 +0000 UTC m=+827.850898069" observedRunningTime="2026-02-23 14:32:15.589361784 +0000 UTC m=+828.552023120" watchObservedRunningTime="2026-02-23 14:32:15.591321475 +0000 UTC m=+828.553982771" Feb 23 14:32:15.907503 master-0 kubenswrapper[7728]: I0223 14:32:15.907438 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:15.907503 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:15.907503 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:15.907503 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:15.908424 master-0 kubenswrapper[7728]: I0223 14:32:15.907514 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:16.007865 master-0 kubenswrapper[7728]: I0223 14:32:16.007808 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Feb 23 14:32:16.585847 master-0 kubenswrapper[7728]: I0223 14:32:16.585703 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"25b855e3-80dc-4ee5-80ab-c4742578a92f","Type":"ContainerStarted","Data":"9e428f62a82052df41d6797fdf53021748cfc643092e404d44bde9e1092162d6"} Feb 23 14:32:16.585847 master-0 kubenswrapper[7728]: I0223 14:32:16.585784 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"25b855e3-80dc-4ee5-80ab-c4742578a92f","Type":"ContainerStarted","Data":"38c349954c9e4d48bcd2d1d0bfed1c2e92410197933aa893d6cec912dd9abe84"} Feb 23 14:32:16.611162 master-0 kubenswrapper[7728]: I0223 14:32:16.611042 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" podStartSLOduration=1.6110171580000001 podStartE2EDuration="1.611017158s" podCreationTimestamp="2026-02-23 14:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:16.608501992 +0000 UTC m=+829.571163308" watchObservedRunningTime="2026-02-23 14:32:16.611017158 +0000 UTC m=+829.573678494" Feb 23 14:32:16.907889 master-0 kubenswrapper[7728]: I0223 14:32:16.907827 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:16.907889 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:16.907889 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:16.907889 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:16.908422 master-0 kubenswrapper[7728]: I0223 14:32:16.907896 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:17.044234 master-0 kubenswrapper[7728]: I0223 14:32:17.044165 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j"] Feb 23 14:32:17.045244 master-0 kubenswrapper[7728]: I0223 14:32:17.045170 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.047594 master-0 kubenswrapper[7728]: I0223 14:32:17.047561 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Feb 23 14:32:17.047712 master-0 kubenswrapper[7728]: I0223 14:32:17.047571 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Feb 23 14:32:17.053544 master-0 kubenswrapper[7728]: I0223 14:32:17.053473 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ckhv6"] Feb 23 14:32:17.054550 master-0 kubenswrapper[7728]: I0223 14:32:17.054517 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-h82x8" Feb 23 14:32:17.055740 master-0 kubenswrapper[7728]: I0223 14:32:17.055709 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.057251 master-0 kubenswrapper[7728]: I0223 14:32:17.057180 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Feb 23 14:32:17.057354 master-0 kubenswrapper[7728]: I0223 14:32:17.057318 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Feb 23 14:32:17.057389 master-0 kubenswrapper[7728]: I0223 14:32:17.057188 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-vkk2b" Feb 23 14:32:17.066072 master-0 kubenswrapper[7728]: I0223 14:32:17.066024 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j"] Feb 23 14:32:17.103501 master-0 kubenswrapper[7728]: I0223 14:32:17.101556 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-59584d565f-pdl4r"] Feb 23 14:32:17.103501 master-0 kubenswrapper[7728]: I0223 14:32:17.102795 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.108158 master-0 kubenswrapper[7728]: I0223 14:32:17.108113 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Feb 23 14:32:17.108346 master-0 kubenswrapper[7728]: I0223 14:32:17.108318 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-vzz9z" Feb 23 14:32:17.109662 master-0 kubenswrapper[7728]: I0223 14:32:17.109628 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Feb 23 14:32:17.109854 master-0 kubenswrapper[7728]: I0223 14:32:17.109837 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Feb 23 14:32:17.113204 master-0 kubenswrapper[7728]: I0223 14:32:17.112854 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-59584d565f-pdl4r"] Feb 23 14:32:17.115669 master-0 kubenswrapper[7728]: I0223 14:32:17.115576 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ddm\" (UniqueName: \"kubernetes.io/projected/15ad7f4e-44c6-4426-8b97-c47a47786544-kube-api-access-94ddm\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.115669 master-0 kubenswrapper[7728]: I0223 14:32:17.115633 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.115669 master-0 
kubenswrapper[7728]: I0223 14:32:17.115648 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.115669 master-0 kubenswrapper[7728]: I0223 14:32:17.115667 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.115853 master-0 kubenswrapper[7728]: I0223 14:32:17.115683 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-sys\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.115853 master-0 kubenswrapper[7728]: I0223 14:32:17.115713 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-textfile\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.115853 master-0 kubenswrapper[7728]: I0223 14:32:17.115741 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-root\") pod \"node-exporter-ckhv6\" (UID: 
\"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.115853 master-0 kubenswrapper[7728]: I0223 14:32:17.115765 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-wtmp\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220012 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220063 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220096 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-sys\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220125 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220155 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220181 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-textfile\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220211 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-root\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220228 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8v9z\" (UniqueName: \"kubernetes.io/projected/fae9a4cf-2acf-4728-9105-87e004052fe5-kube-api-access-x8v9z\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: 
\"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220246 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220263 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220285 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-wtmp\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220303 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220320 7728 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzcqx\" (UniqueName: \"kubernetes.io/projected/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-api-access-dzcqx\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220338 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ddm\" (UniqueName: \"kubernetes.io/projected/15ad7f4e-44c6-4426-8b97-c47a47786544-kube-api-access-94ddm\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220355 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220377 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-volume-directive-shadow\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220398 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca\") pod 
\"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: I0223 14:32:17.220414 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: E0223 14:32:17.220535 7728 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Feb 23 14:32:17.224514 master-0 kubenswrapper[7728]: E0223 14:32:17.220582 7728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls podName:15ad7f4e-44c6-4426-8b97-c47a47786544 nodeName:}" failed. No retries permitted until 2026-02-23 14:32:17.72056654 +0000 UTC m=+830.683227826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls") pod "node-exporter-ckhv6" (UID: "15ad7f4e-44c6-4426-8b97-c47a47786544") : secret "node-exporter-tls" not found Feb 23 14:32:17.225999 master-0 kubenswrapper[7728]: I0223 14:32:17.225773 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-sys\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.226809 master-0 kubenswrapper[7728]: I0223 14:32:17.226116 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-textfile\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.226809 master-0 kubenswrapper[7728]: I0223 14:32:17.226158 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-root\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.226809 master-0 kubenswrapper[7728]: I0223 14:32:17.226489 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-wtmp\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.227006 master-0 kubenswrapper[7728]: I0223 14:32:17.226975 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.230091 master-0 kubenswrapper[7728]: I0223 14:32:17.229921 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.248502 master-0 kubenswrapper[7728]: I0223 14:32:17.247513 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ddm\" (UniqueName: \"kubernetes.io/projected/15ad7f4e-44c6-4426-8b97-c47a47786544-kube-api-access-94ddm\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.321888 master-0 kubenswrapper[7728]: I0223 14:32:17.321374 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.322063 master-0 kubenswrapper[7728]: I0223 14:32:17.321908 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " 
pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.322063 master-0 kubenswrapper[7728]: I0223 14:32:17.321943 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.322898 master-0 kubenswrapper[7728]: I0223 14:32:17.322773 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.322963 master-0 kubenswrapper[7728]: I0223 14:32:17.322933 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8v9z\" (UniqueName: \"kubernetes.io/projected/fae9a4cf-2acf-4728-9105-87e004052fe5-kube-api-access-x8v9z\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.323037 master-0 kubenswrapper[7728]: I0223 14:32:17.323010 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.323080 master-0 
kubenswrapper[7728]: I0223 14:32:17.323051 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.323131 master-0 kubenswrapper[7728]: I0223 14:32:17.323114 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.323169 master-0 kubenswrapper[7728]: I0223 14:32:17.323136 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzcqx\" (UniqueName: \"kubernetes.io/projected/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-api-access-dzcqx\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.323169 master-0 kubenswrapper[7728]: I0223 14:32:17.323162 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.323243 master-0 kubenswrapper[7728]: I0223 14:32:17.323195 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-volume-directive-shadow\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.324521 master-0 kubenswrapper[7728]: I0223 14:32:17.323880 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.324521 master-0 kubenswrapper[7728]: I0223 14:32:17.323883 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.324521 master-0 kubenswrapper[7728]: I0223 14:32:17.324322 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-volume-directive-shadow\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.324702 master-0 kubenswrapper[7728]: I0223 14:32:17.324662 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " 
pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.326504 master-0 kubenswrapper[7728]: I0223 14:32:17.325700 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.327239 master-0 kubenswrapper[7728]: I0223 14:32:17.327163 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.328163 master-0 kubenswrapper[7728]: I0223 14:32:17.328118 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.339799 master-0 kubenswrapper[7728]: I0223 14:32:17.339742 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8v9z\" (UniqueName: \"kubernetes.io/projected/fae9a4cf-2acf-4728-9105-87e004052fe5-kube-api-access-x8v9z\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.341123 master-0 kubenswrapper[7728]: I0223 14:32:17.341091 7728 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-dzcqx\" (UniqueName: \"kubernetes.io/projected/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-api-access-dzcqx\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.359912 master-0 kubenswrapper[7728]: I0223 14:32:17.359840 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:32:17.449618 master-0 kubenswrapper[7728]: I0223 14:32:17.449569 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:32:17.595499 master-0 kubenswrapper[7728]: I0223 14:32:17.595443 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/4.log" Feb 23 14:32:17.595990 master-0 kubenswrapper[7728]: I0223 14:32:17.595966 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/3.log" Feb 23 14:32:17.596261 master-0 kubenswrapper[7728]: I0223 14:32:17.596234 7728 generic.go:334] "Generic (PLEG): container finished" podID="3488a7eb-5170-478c-9af7-490dbe0f514e" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f" exitCode=1 Feb 23 14:32:17.596666 master-0 kubenswrapper[7728]: I0223 14:32:17.596568 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerDied","Data":"b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f"} Feb 23 14:32:17.596666 master-0 kubenswrapper[7728]: I0223 14:32:17.596644 7728 scope.go:117] "RemoveContainer" 
containerID="ee438b16358b074b5b2b8beb6e302d9825fe67047f68f4e63ab8627671ec9d19" Feb 23 14:32:17.597234 master-0 kubenswrapper[7728]: I0223 14:32:17.597210 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f" Feb 23 14:32:17.597428 master-0 kubenswrapper[7728]: E0223 14:32:17.597402 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:32:17.731032 master-0 kubenswrapper[7728]: I0223 14:32:17.730973 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.734196 master-0 kubenswrapper[7728]: I0223 14:32:17.734143 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:17.759014 master-0 kubenswrapper[7728]: I0223 14:32:17.758960 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j"] Feb 23 14:32:17.764988 master-0 kubenswrapper[7728]: W0223 14:32:17.764905 7728 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae9a4cf_2acf_4728_9105_87e004052fe5.slice/crio-f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95 WatchSource:0}: Error finding container f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95: Status 404 returned error can't find the container with id f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95 Feb 23 14:32:17.908160 master-0 kubenswrapper[7728]: I0223 14:32:17.908068 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:17.908160 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:17.908160 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:17.908160 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:17.908823 master-0 kubenswrapper[7728]: I0223 14:32:17.908187 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:17.910419 master-0 kubenswrapper[7728]: I0223 14:32:17.910386 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-59584d565f-pdl4r"] Feb 23 14:32:17.921211 master-0 kubenswrapper[7728]: W0223 14:32:17.921149 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05baa8c_39e2_4a73_aa0f_f1ccf074fd4c.slice/crio-f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1 WatchSource:0}: Error finding container f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1: Status 404 returned error can't find the container with id 
f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1 Feb 23 14:32:17.975403 master-0 kubenswrapper[7728]: I0223 14:32:17.975329 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:32:18.607153 master-0 kubenswrapper[7728]: I0223 14:32:18.607037 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"b1dde519d91e918a963d5dd69f1689c73c27c10bcb7f3538a8a0cc3077e15afb"} Feb 23 14:32:18.607153 master-0 kubenswrapper[7728]: I0223 14:32:18.607110 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"20acbe7091a8520599c223e28966f8924f32322212993de0d035d00606e90c34"} Feb 23 14:32:18.607153 master-0 kubenswrapper[7728]: I0223 14:32:18.607122 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95"} Feb 23 14:32:18.608791 master-0 kubenswrapper[7728]: I0223 14:32:18.608731 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1"} Feb 23 14:32:18.610835 master-0 kubenswrapper[7728]: I0223 14:32:18.610788 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/4.log" Feb 23 14:32:18.613261 master-0 kubenswrapper[7728]: I0223 14:32:18.613209 7728 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerStarted","Data":"38ec00e9dfbef6fee1166da5b097f1e6a12696d48f32303b497f0b2d760141c8"} Feb 23 14:32:18.908059 master-0 kubenswrapper[7728]: I0223 14:32:18.907965 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:18.908059 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:18.908059 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:18.908059 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:18.908691 master-0 kubenswrapper[7728]: I0223 14:32:18.908656 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:19.906642 master-0 kubenswrapper[7728]: I0223 14:32:19.906577 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:19.906642 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:19.906642 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:19.906642 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:19.906992 master-0 kubenswrapper[7728]: I0223 14:32:19.906707 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Feb 23 14:32:20.628517 master-0 kubenswrapper[7728]: I0223 14:32:20.628408 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"02e5f74fd0fc64761a36d80414a3b15fecade9e3a74052c75afffc286b820ab8"} Feb 23 14:32:20.636396 master-0 kubenswrapper[7728]: I0223 14:32:20.636337 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"027076b56ad4f46e2ddfd17e8442ed8e0b84dc543030de168a735a3c04b63c22"} Feb 23 14:32:20.636605 master-0 kubenswrapper[7728]: I0223 14:32:20.636402 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"e10405a7c233b1e760533a57a424faa717efb1d70d9ccd63d71a225ba97dfe41"} Feb 23 14:32:20.636605 master-0 kubenswrapper[7728]: I0223 14:32:20.636432 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"2d6756d14611d57202a60d04d7be8115390739a88f5d00004fc09162e73f45a9"} Feb 23 14:32:20.639253 master-0 kubenswrapper[7728]: I0223 14:32:20.639183 7728 generic.go:334] "Generic (PLEG): container finished" podID="15ad7f4e-44c6-4426-8b97-c47a47786544" containerID="1c473cd845ba12d4bf1927e76251ff9dc47cd40e997137152d0746ccd7834430" exitCode=0 Feb 23 14:32:20.639253 master-0 kubenswrapper[7728]: I0223 14:32:20.639237 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" 
event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerDied","Data":"1c473cd845ba12d4bf1927e76251ff9dc47cd40e997137152d0746ccd7834430"} Feb 23 14:32:20.661658 master-0 kubenswrapper[7728]: I0223 14:32:20.661246 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" podStartSLOduration=2.110374717 podStartE2EDuration="3.661210482s" podCreationTimestamp="2026-02-23 14:32:17 +0000 UTC" firstStartedPulling="2026-02-23 14:32:18.078335494 +0000 UTC m=+831.040996800" lastFinishedPulling="2026-02-23 14:32:19.629171269 +0000 UTC m=+832.591832565" observedRunningTime="2026-02-23 14:32:20.654443856 +0000 UTC m=+833.617105162" watchObservedRunningTime="2026-02-23 14:32:20.661210482 +0000 UTC m=+833.623871808" Feb 23 14:32:20.692265 master-0 kubenswrapper[7728]: I0223 14:32:20.692153 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" podStartSLOduration=1.980547343 podStartE2EDuration="3.692128006s" podCreationTimestamp="2026-02-23 14:32:17 +0000 UTC" firstStartedPulling="2026-02-23 14:32:17.922705469 +0000 UTC m=+830.885366765" lastFinishedPulling="2026-02-23 14:32:19.634286132 +0000 UTC m=+832.596947428" observedRunningTime="2026-02-23 14:32:20.677228529 +0000 UTC m=+833.639889925" watchObservedRunningTime="2026-02-23 14:32:20.692128006 +0000 UTC m=+833.654789302" Feb 23 14:32:20.907157 master-0 kubenswrapper[7728]: I0223 14:32:20.907036 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:20.907157 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:20.907157 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:20.907157 master-0 
kubenswrapper[7728]: healthz check failed Feb 23 14:32:20.907157 master-0 kubenswrapper[7728]: I0223 14:32:20.907115 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:21.015941 master-0 kubenswrapper[7728]: I0223 14:32:21.015872 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Feb 23 14:32:21.017105 master-0 kubenswrapper[7728]: I0223 14:32:21.017084 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.020132 master-0 kubenswrapper[7728]: I0223 14:32:21.020089 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Feb 23 14:32:21.021824 master-0 kubenswrapper[7728]: I0223 14:32:21.021785 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bhcxh" Feb 23 14:32:21.032786 master-0 kubenswrapper[7728]: I0223 14:32:21.032710 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Feb 23 14:32:21.078267 master-0 kubenswrapper[7728]: I0223 14:32:21.078022 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.078644 master-0 kubenswrapper[7728]: I0223 14:32:21.078616 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.078850 master-0 kubenswrapper[7728]: I0223 14:32:21.078821 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15245f43-22db-42eb-ab0b-702240986437-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.180460 master-0 kubenswrapper[7728]: I0223 14:32:21.180303 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.180460 master-0 kubenswrapper[7728]: I0223 14:32:21.180385 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.180460 master-0 kubenswrapper[7728]: I0223 14:32:21.180430 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15245f43-22db-42eb-ab0b-702240986437-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.180803 master-0 kubenswrapper[7728]: I0223 14:32:21.180633 7728 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.180996 master-0 kubenswrapper[7728]: I0223 14:32:21.180974 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.200582 master-0 kubenswrapper[7728]: I0223 14:32:21.200546 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15245f43-22db-42eb-ab0b-702240986437-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.341748 master-0 kubenswrapper[7728]: I0223 14:32:21.341654 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:32:21.649421 master-0 kubenswrapper[7728]: I0223 14:32:21.649348 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerStarted","Data":"3af9ebc2851196aac12096fa739a23a372f4916e42d4cc0e0b42e3494587ee38"} Feb 23 14:32:21.649421 master-0 kubenswrapper[7728]: I0223 14:32:21.649423 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerStarted","Data":"9cd0e2f066454ddc59ba9d55af0212ac1d7bb30de28054ded4b6510cddb437d7"} Feb 23 14:32:21.709770 master-0 kubenswrapper[7728]: I0223 14:32:21.709579 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ckhv6" podStartSLOduration=3.0913985840000002 podStartE2EDuration="4.709556279s" podCreationTimestamp="2026-02-23 14:32:17 +0000 UTC" firstStartedPulling="2026-02-23 14:32:18.013461378 +0000 UTC m=+830.976122674" lastFinishedPulling="2026-02-23 14:32:19.631619073 +0000 UTC m=+832.594280369" observedRunningTime="2026-02-23 14:32:21.707465875 +0000 UTC m=+834.670127171" watchObservedRunningTime="2026-02-23 14:32:21.709556279 +0000 UTC m=+834.672217615" Feb 23 14:32:21.852322 master-0 kubenswrapper[7728]: I0223 14:32:21.852252 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Feb 23 14:32:21.858915 master-0 kubenswrapper[7728]: W0223 14:32:21.858817 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod15245f43_22db_42eb_ab0b_702240986437.slice/crio-926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42 WatchSource:0}: Error finding container 926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42: Status 404 returned error 
can't find the container with id 926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42 Feb 23 14:32:21.907799 master-0 kubenswrapper[7728]: I0223 14:32:21.907671 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:21.907799 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:21.907799 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:21.907799 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:21.908180 master-0 kubenswrapper[7728]: I0223 14:32:21.907838 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:22.619178 master-0 kubenswrapper[7728]: I0223 14:32:22.619091 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-f55d8f669-b2gf9"] Feb 23 14:32:22.619987 master-0 kubenswrapper[7728]: I0223 14:32:22.619949 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.623130 master-0 kubenswrapper[7728]: I0223 14:32:22.623022 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-l72nr" Feb 23 14:32:22.623817 master-0 kubenswrapper[7728]: I0223 14:32:22.623266 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Feb 23 14:32:22.632377 master-0 kubenswrapper[7728]: I0223 14:32:22.625852 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-car2df00nf4i0" Feb 23 14:32:22.632377 master-0 kubenswrapper[7728]: I0223 14:32:22.626031 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Feb 23 14:32:22.632377 master-0 kubenswrapper[7728]: I0223 14:32:22.626242 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Feb 23 14:32:22.632377 master-0 kubenswrapper[7728]: I0223 14:32:22.626741 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Feb 23 14:32:22.636067 master-0 kubenswrapper[7728]: I0223 14:32:22.636004 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f55d8f669-b2gf9"] Feb 23 14:32:22.658319 master-0 kubenswrapper[7728]: I0223 14:32:22.658218 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"15245f43-22db-42eb-ab0b-702240986437","Type":"ContainerStarted","Data":"d8fda6fec7eadedba1d4400e4d7e27798506234350c769e6451d1eaf5b0ede8d"} Feb 23 14:32:22.658319 master-0 kubenswrapper[7728]: I0223 14:32:22.658283 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" 
event={"ID":"15245f43-22db-42eb-ab0b-702240986437","Type":"ContainerStarted","Data":"926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42"} Feb 23 14:32:22.689954 master-0 kubenswrapper[7728]: I0223 14:32:22.680311 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podStartSLOduration=1.680284458 podStartE2EDuration="1.680284458s" podCreationTimestamp="2026-02-23 14:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:22.677065544 +0000 UTC m=+835.639726850" watchObservedRunningTime="2026-02-23 14:32:22.680284458 +0000 UTC m=+835.642945784" Feb 23 14:32:22.807807 master-0 kubenswrapper[7728]: I0223 14:32:22.806937 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.807807 master-0 kubenswrapper[7728]: I0223 14:32:22.807411 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.807807 master-0 kubenswrapper[7728]: I0223 14:32:22.807525 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnfl\" (UniqueName: \"kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl\") pod \"metrics-server-f55d8f669-b2gf9\" 
(UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.808639 master-0 kubenswrapper[7728]: I0223 14:32:22.808588 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.808800 master-0 kubenswrapper[7728]: I0223 14:32:22.808660 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.808800 master-0 kubenswrapper[7728]: I0223 14:32:22.808750 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.808961 master-0 kubenswrapper[7728]: I0223 14:32:22.808817 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.907416 master-0 kubenswrapper[7728]: I0223 14:32:22.907288 7728 
patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:22.907416 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:22.907416 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:22.907416 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:22.907972 master-0 kubenswrapper[7728]: I0223 14:32:22.907930 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:22.910191 master-0 kubenswrapper[7728]: I0223 14:32:22.910132 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.910328 master-0 kubenswrapper[7728]: I0223 14:32:22.910213 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.910328 master-0 kubenswrapper[7728]: I0223 14:32:22.910245 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnfl\" (UniqueName: \"kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl\") pod 
\"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.910328 master-0 kubenswrapper[7728]: I0223 14:32:22.910268 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.910328 master-0 kubenswrapper[7728]: I0223 14:32:22.910306 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.910636 master-0 kubenswrapper[7728]: I0223 14:32:22.910338 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.910636 master-0 kubenswrapper[7728]: I0223 14:32:22.910373 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.911531 master-0 kubenswrapper[7728]: I0223 14:32:22.911449 7728 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.911531 master-0 kubenswrapper[7728]: I0223 14:32:22.911521 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.913301 master-0 kubenswrapper[7728]: I0223 14:32:22.913235 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.915965 master-0 kubenswrapper[7728]: I0223 14:32:22.915913 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.916596 master-0 kubenswrapper[7728]: I0223 14:32:22.916554 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" 
Feb 23 14:32:22.917214 master-0 kubenswrapper[7728]: I0223 14:32:22.917181 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.941106 master-0 kubenswrapper[7728]: I0223 14:32:22.941067 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnfl\" (UniqueName: \"kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:22.971761 master-0 kubenswrapper[7728]: I0223 14:32:22.971673 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:23.434469 master-0 kubenswrapper[7728]: I0223 14:32:23.434392 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f55d8f669-b2gf9"] Feb 23 14:32:23.443947 master-0 kubenswrapper[7728]: W0223 14:32:23.443885 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9416f5d0_32b4_4065_b678_26913af8b6dd.slice/crio-ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436 WatchSource:0}: Error finding container ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436: Status 404 returned error can't find the container with id ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436 Feb 23 14:32:23.668993 master-0 kubenswrapper[7728]: I0223 14:32:23.668594 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" 
event={"ID":"9416f5d0-32b4-4065-b678-26913af8b6dd","Type":"ContainerStarted","Data":"ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436"} Feb 23 14:32:23.908687 master-0 kubenswrapper[7728]: I0223 14:32:23.908550 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:23.908687 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:23.908687 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:23.908687 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:23.908687 master-0 kubenswrapper[7728]: I0223 14:32:23.908641 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:24.908007 master-0 kubenswrapper[7728]: I0223 14:32:24.907902 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:24.908007 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:24.908007 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:24.908007 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:24.908652 master-0 kubenswrapper[7728]: I0223 14:32:24.908043 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:25.685981 master-0 kubenswrapper[7728]: I0223 
14:32:25.685916 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" event={"ID":"9416f5d0-32b4-4065-b678-26913af8b6dd","Type":"ContainerStarted","Data":"f866731e4ac5121ccde39a6f28422037df55500fc5889296919662d103c3a36f"} Feb 23 14:32:25.712633 master-0 kubenswrapper[7728]: I0223 14:32:25.712563 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" podStartSLOduration=2.288040444 podStartE2EDuration="3.712546517s" podCreationTimestamp="2026-02-23 14:32:22 +0000 UTC" firstStartedPulling="2026-02-23 14:32:23.447336864 +0000 UTC m=+836.409998150" lastFinishedPulling="2026-02-23 14:32:24.871842927 +0000 UTC m=+837.834504223" observedRunningTime="2026-02-23 14:32:25.708696377 +0000 UTC m=+838.671357683" watchObservedRunningTime="2026-02-23 14:32:25.712546517 +0000 UTC m=+838.675207813" Feb 23 14:32:25.906955 master-0 kubenswrapper[7728]: I0223 14:32:25.906846 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:25.906955 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:25.906955 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:25.906955 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:25.906955 master-0 kubenswrapper[7728]: I0223 14:32:25.906928 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:26.907559 master-0 kubenswrapper[7728]: I0223 14:32:26.907438 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:26.907559 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:26.907559 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:26.907559 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:26.908767 master-0 kubenswrapper[7728]: I0223 14:32:26.907569 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:27.908289 master-0 kubenswrapper[7728]: I0223 14:32:27.908200 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:27.908289 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:27.908289 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:27.908289 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:27.908945 master-0 kubenswrapper[7728]: I0223 14:32:27.908306 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:28.908589 master-0 kubenswrapper[7728]: I0223 14:32:28.908468 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:28.908589 master-0 kubenswrapper[7728]: 
[-]has-synced failed: reason withheld Feb 23 14:32:28.908589 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:28.908589 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:28.908589 master-0 kubenswrapper[7728]: I0223 14:32:28.908581 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:29.907863 master-0 kubenswrapper[7728]: I0223 14:32:29.907773 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:29.907863 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:29.907863 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:29.907863 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:29.907863 master-0 kubenswrapper[7728]: I0223 14:32:29.907852 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:30.419974 master-0 kubenswrapper[7728]: I0223 14:32:30.419893 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nwdpd"] Feb 23 14:32:30.421904 master-0 kubenswrapper[7728]: I0223 14:32:30.421853 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.423678 master-0 kubenswrapper[7728]: I0223 14:32:30.423631 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 23 14:32:30.423774 master-0 kubenswrapper[7728]: I0223 14:32:30.423631 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 14:32:30.424776 master-0 kubenswrapper[7728]: I0223 14:32:30.424720 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 14:32:30.424858 master-0 kubenswrapper[7728]: I0223 14:32:30.424771 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-wvfd9" Feb 23 14:32:30.431570 master-0 kubenswrapper[7728]: I0223 14:32:30.431452 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nwdpd"] Feb 23 14:32:30.522080 master-0 kubenswrapper[7728]: I0223 14:32:30.521835 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.522080 master-0 kubenswrapper[7728]: I0223 14:32:30.521925 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpqzn\" (UniqueName: \"kubernetes.io/projected/87f989cd-6c19-4a30-833a-10e98b7a0326-kube-api-access-wpqzn\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.623493 master-0 kubenswrapper[7728]: I0223 14:32:30.623395 7728 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.623724 master-0 kubenswrapper[7728]: I0223 14:32:30.623518 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpqzn\" (UniqueName: \"kubernetes.io/projected/87f989cd-6c19-4a30-833a-10e98b7a0326-kube-api-access-wpqzn\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.629004 master-0 kubenswrapper[7728]: I0223 14:32:30.628950 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.641769 master-0 kubenswrapper[7728]: I0223 14:32:30.641729 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpqzn\" (UniqueName: \"kubernetes.io/projected/87f989cd-6c19-4a30-833a-10e98b7a0326-kube-api-access-wpqzn\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.757242 master-0 kubenswrapper[7728]: I0223 14:32:30.757180 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:32:30.879995 master-0 kubenswrapper[7728]: I0223 14:32:30.879628 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-njjl7"] Feb 23 14:32:30.881643 master-0 kubenswrapper[7728]: I0223 14:32:30.881499 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:30.884086 master-0 kubenswrapper[7728]: I0223 14:32:30.883737 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-b9qj9" Feb 23 14:32:30.884086 master-0 kubenswrapper[7728]: I0223 14:32:30.883947 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Feb 23 14:32:30.915086 master-0 kubenswrapper[7728]: I0223 14:32:30.914707 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:30.915086 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:30.915086 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:30.915086 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:30.915086 master-0 kubenswrapper[7728]: I0223 14:32:30.914780 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:30.928733 master-0 kubenswrapper[7728]: I0223 14:32:30.928663 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56b2b6a4-9230-46e4-9689-b267bc753b6f-ready\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:30.928733 master-0 kubenswrapper[7728]: I0223 14:32:30.928726 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/56b2b6a4-9230-46e4-9689-b267bc753b6f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:30.930863 master-0 kubenswrapper[7728]: I0223 14:32:30.928765 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56b2b6a4-9230-46e4-9689-b267bc753b6f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:30.930863 master-0 kubenswrapper[7728]: I0223 14:32:30.928834 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgkrc\" (UniqueName: \"kubernetes.io/projected/56b2b6a4-9230-46e4-9689-b267bc753b6f-kube-api-access-kgkrc\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.030099 master-0 kubenswrapper[7728]: I0223 14:32:31.030005 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgkrc\" (UniqueName: \"kubernetes.io/projected/56b2b6a4-9230-46e4-9689-b267bc753b6f-kube-api-access-kgkrc\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.030325 master-0 kubenswrapper[7728]: I0223 14:32:31.030309 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56b2b6a4-9230-46e4-9689-b267bc753b6f-ready\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.030403 master-0 kubenswrapper[7728]: I0223 14:32:31.030392 
7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56b2b6a4-9230-46e4-9689-b267bc753b6f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.030529 master-0 kubenswrapper[7728]: I0223 14:32:31.030471 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56b2b6a4-9230-46e4-9689-b267bc753b6f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.031174 master-0 kubenswrapper[7728]: I0223 14:32:31.031158 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56b2b6a4-9230-46e4-9689-b267bc753b6f-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.031793 master-0 kubenswrapper[7728]: I0223 14:32:31.031777 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56b2b6a4-9230-46e4-9689-b267bc753b6f-ready\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.031937 master-0 kubenswrapper[7728]: I0223 14:32:31.031924 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56b2b6a4-9230-46e4-9689-b267bc753b6f-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.053492 master-0 
kubenswrapper[7728]: I0223 14:32:31.053449 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgkrc\" (UniqueName: \"kubernetes.io/projected/56b2b6a4-9230-46e4-9689-b267bc753b6f-kube-api-access-kgkrc\") pod \"cni-sysctl-allowlist-ds-njjl7\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") " pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.213965 master-0 kubenswrapper[7728]: I0223 14:32:31.213921 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.237628 master-0 kubenswrapper[7728]: W0223 14:32:31.236505 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b2b6a4_9230_46e4_9689_b267bc753b6f.slice/crio-310238c5b2504d8af703571fde117480b00b3b9b8f61f02ad61a3558948f39ca WatchSource:0}: Error finding container 310238c5b2504d8af703571fde117480b00b3b9b8f61f02ad61a3558948f39ca: Status 404 returned error can't find the container with id 310238c5b2504d8af703571fde117480b00b3b9b8f61f02ad61a3558948f39ca Feb 23 14:32:31.243773 master-0 kubenswrapper[7728]: I0223 14:32:31.242852 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nwdpd"] Feb 23 14:32:31.245710 master-0 kubenswrapper[7728]: W0223 14:32:31.245671 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f989cd_6c19_4a30_833a_10e98b7a0326.slice/crio-b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8 WatchSource:0}: Error finding container b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8: Status 404 returned error can't find the container with id b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8 Feb 23 14:32:31.730126 master-0 kubenswrapper[7728]: I0223 14:32:31.730007 7728 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" event={"ID":"56b2b6a4-9230-46e4-9689-b267bc753b6f","Type":"ContainerStarted","Data":"b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2"} Feb 23 14:32:31.730929 master-0 kubenswrapper[7728]: I0223 14:32:31.730337 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" event={"ID":"56b2b6a4-9230-46e4-9689-b267bc753b6f","Type":"ContainerStarted","Data":"310238c5b2504d8af703571fde117480b00b3b9b8f61f02ad61a3558948f39ca"} Feb 23 14:32:31.730929 master-0 kubenswrapper[7728]: I0223 14:32:31.730818 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:31.732904 master-0 kubenswrapper[7728]: I0223 14:32:31.732843 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nwdpd" event={"ID":"87f989cd-6c19-4a30-833a-10e98b7a0326","Type":"ContainerStarted","Data":"3d15bb45bdbf256571f89062eabcbe60b3f2bafe86110ce99ab9a1b2166faf20"} Feb 23 14:32:31.733033 master-0 kubenswrapper[7728]: I0223 14:32:31.732909 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nwdpd" event={"ID":"87f989cd-6c19-4a30-833a-10e98b7a0326","Type":"ContainerStarted","Data":"b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8"} Feb 23 14:32:31.908024 master-0 kubenswrapper[7728]: I0223 14:32:31.907829 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:31.908024 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:31.908024 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:31.908024 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:31.908024 
master-0 kubenswrapper[7728]: I0223 14:32:31.907918 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:31.937430 master-0 kubenswrapper[7728]: I0223 14:32:31.937276 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" podStartSLOduration=1.937240997 podStartE2EDuration="1.937240997s" podCreationTimestamp="2026-02-23 14:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:31.931669443 +0000 UTC m=+844.894330729" watchObservedRunningTime="2026-02-23 14:32:31.937240997 +0000 UTC m=+844.899902323" Feb 23 14:32:32.016210 master-0 kubenswrapper[7728]: I0223 14:32:32.016148 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Feb 23 14:32:32.018469 master-0 kubenswrapper[7728]: I0223 14:32:32.018410 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.020404 master-0 kubenswrapper[7728]: I0223 14:32:32.020350 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 14:32:32.021070 master-0 kubenswrapper[7728]: I0223 14:32:32.021019 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5gfcq" Feb 23 14:32:32.047276 master-0 kubenswrapper[7728]: I0223 14:32:32.047215 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.047532 master-0 kubenswrapper[7728]: I0223 14:32:32.047288 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5caa659d-854d-43d6-8169-e03a6b1e97f1-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.047532 master-0 kubenswrapper[7728]: I0223 14:32:32.047334 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.100770 master-0 kubenswrapper[7728]: I0223 14:32:32.100690 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Feb 23 14:32:32.148853 master-0 
kubenswrapper[7728]: I0223 14:32:32.148770 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.149101 master-0 kubenswrapper[7728]: I0223 14:32:32.148924 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.149101 master-0 kubenswrapper[7728]: I0223 14:32:32.148979 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5caa659d-854d-43d6-8169-e03a6b1e97f1-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.149101 master-0 kubenswrapper[7728]: I0223 14:32:32.149044 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.149305 master-0 kubenswrapper[7728]: I0223 14:32:32.149149 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" 
Feb 23 14:32:32.206606 master-0 kubenswrapper[7728]: I0223 14:32:32.206026 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5caa659d-854d-43d6-8169-e03a6b1e97f1-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.348825 master-0 kubenswrapper[7728]: I0223 14:32:32.348756 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Feb 23 14:32:32.770694 master-0 kubenswrapper[7728]: I0223 14:32:32.770565 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:32:32.907576 master-0 kubenswrapper[7728]: I0223 14:32:32.907497 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:32.907576 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:32.907576 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:32.907576 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:32.907576 master-0 kubenswrapper[7728]: I0223 14:32:32.907573 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:33.221518 master-0 kubenswrapper[7728]: I0223 14:32:33.221432 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f" Feb 23 14:32:33.221973 master-0 kubenswrapper[7728]: E0223 14:32:33.221911 7728 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:32:33.908698 master-0 kubenswrapper[7728]: I0223 14:32:33.908604 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:33.908698 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:33.908698 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:33.908698 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:33.909730 master-0 kubenswrapper[7728]: I0223 14:32:33.908731 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:34.911000 master-0 kubenswrapper[7728]: I0223 14:32:34.908530 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:34.911000 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:34.911000 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:34.911000 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:34.911000 master-0 kubenswrapper[7728]: I0223 14:32:34.908629 7728 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:35.907580 master-0 kubenswrapper[7728]: I0223 14:32:35.907521 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:35.907580 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:35.907580 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:35.907580 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:35.908623 master-0 kubenswrapper[7728]: I0223 14:32:35.908543 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:35.981595 master-0 kubenswrapper[7728]: I0223 14:32:35.981017 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nwdpd" podStartSLOduration=5.980975465 podStartE2EDuration="5.980975465s" podCreationTimestamp="2026-02-23 14:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:35.951658183 +0000 UTC m=+848.914319519" watchObservedRunningTime="2026-02-23 14:32:35.980975465 +0000 UTC m=+848.943636801" Feb 23 14:32:36.005919 master-0 kubenswrapper[7728]: I0223 14:32:36.005868 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Feb 23 14:32:36.768807 master-0 kubenswrapper[7728]: I0223 14:32:36.768722 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"5caa659d-854d-43d6-8169-e03a6b1e97f1","Type":"ContainerStarted","Data":"df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054"} Feb 23 14:32:36.768807 master-0 kubenswrapper[7728]: I0223 14:32:36.768790 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"5caa659d-854d-43d6-8169-e03a6b1e97f1","Type":"ContainerStarted","Data":"a1b72bc7fa94d2067752196074cb58d6cb50c64ab06eb17aa033c2353eecf711"} Feb 23 14:32:36.885792 master-0 kubenswrapper[7728]: I0223 14:32:36.885685 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=5.885654277 podStartE2EDuration="5.885654277s" podCreationTimestamp="2026-02-23 14:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:36.82537441 +0000 UTC m=+849.788035726" watchObservedRunningTime="2026-02-23 14:32:36.885654277 +0000 UTC m=+849.848315573" Feb 23 14:32:36.886465 master-0 kubenswrapper[7728]: I0223 14:32:36.886436 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-njjl7"] Feb 23 14:32:36.886849 master-0 kubenswrapper[7728]: I0223 14:32:36.886746 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" gracePeriod=30 Feb 23 14:32:36.907503 master-0 kubenswrapper[7728]: I0223 14:32:36.907414 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 
500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:36.907503 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:36.907503 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:36.907503 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:36.907845 master-0 kubenswrapper[7728]: I0223 14:32:36.907541 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:37.907621 master-0 kubenswrapper[7728]: I0223 14:32:37.907576 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:37.907621 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:37.907621 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:37.907621 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:37.908233 master-0 kubenswrapper[7728]: I0223 14:32:37.908207 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:38.907295 master-0 kubenswrapper[7728]: I0223 14:32:38.907171 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:38.907295 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:38.907295 master-0 kubenswrapper[7728]: 
[+]process-running ok Feb 23 14:32:38.907295 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:38.907295 master-0 kubenswrapper[7728]: I0223 14:32:38.907263 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:39.907692 master-0 kubenswrapper[7728]: I0223 14:32:39.907610 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:39.907692 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:39.907692 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:39.907692 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:39.908363 master-0 kubenswrapper[7728]: I0223 14:32:39.907693 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:40.359705 master-0 kubenswrapper[7728]: I0223 14:32:40.359642 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"] Feb 23 14:32:40.359937 master-0 kubenswrapper[7728]: I0223 14:32:40.359865 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podUID="5caa659d-854d-43d6-8169-e03a6b1e97f1" containerName="installer" containerID="cri-o://df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054" gracePeriod=30 Feb 23 14:32:40.907749 master-0 kubenswrapper[7728]: I0223 14:32:40.907707 7728 patch_prober.go:28] interesting 
pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:40.907749 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:40.907749 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:40.907749 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:40.908433 master-0 kubenswrapper[7728]: I0223 14:32:40.907757 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:41.216251 master-0 kubenswrapper[7728]: E0223 14:32:41.216174 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 14:32:41.217942 master-0 kubenswrapper[7728]: E0223 14:32:41.217897 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 14:32:41.219194 master-0 kubenswrapper[7728]: E0223 14:32:41.219153 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 14:32:41.219194 
master-0 kubenswrapper[7728]: E0223 14:32:41.219186 7728 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerName="kube-multus-additional-cni-plugins" Feb 23 14:32:41.907923 master-0 kubenswrapper[7728]: I0223 14:32:41.907827 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:41.907923 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:41.907923 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:41.907923 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:41.909026 master-0 kubenswrapper[7728]: I0223 14:32:41.908590 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:42.322074 master-0 kubenswrapper[7728]: I0223 14:32:42.321990 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz"] Feb 23 14:32:42.323200 master-0 kubenswrapper[7728]: I0223 14:32:42.323176 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.325506 master-0 kubenswrapper[7728]: I0223 14:32:42.325446 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-srdtb" Feb 23 14:32:42.332109 master-0 kubenswrapper[7728]: I0223 14:32:42.332034 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz"] Feb 23 14:32:42.395243 master-0 kubenswrapper[7728]: I0223 14:32:42.395156 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chs7z\" (UniqueName: \"kubernetes.io/projected/8dd5fa7c-0519-4170-89c6-b369e5fc1990-kube-api-access-chs7z\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.395617 master-0 kubenswrapper[7728]: I0223 14:32:42.395275 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.496718 master-0 kubenswrapper[7728]: I0223 14:32:42.496673 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.496936 master-0 kubenswrapper[7728]: I0223 14:32:42.496762 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-chs7z\" (UniqueName: \"kubernetes.io/projected/8dd5fa7c-0519-4170-89c6-b369e5fc1990-kube-api-access-chs7z\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.499950 master-0 kubenswrapper[7728]: I0223 14:32:42.499884 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.513650 master-0 kubenswrapper[7728]: I0223 14:32:42.513580 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chs7z\" (UniqueName: \"kubernetes.io/projected/8dd5fa7c-0519-4170-89c6-b369e5fc1990-kube-api-access-chs7z\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.638834 master-0 kubenswrapper[7728]: I0223 14:32:42.638679 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:32:42.906848 master-0 kubenswrapper[7728]: I0223 14:32:42.906512 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:42.906848 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:42.906848 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:42.906848 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:42.906848 master-0 kubenswrapper[7728]: I0223 14:32:42.906567 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:42.972096 master-0 kubenswrapper[7728]: I0223 14:32:42.972056 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:42.972825 master-0 kubenswrapper[7728]: I0223 14:32:42.972807 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:32:43.137258 master-0 kubenswrapper[7728]: I0223 14:32:43.137217 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz"] Feb 23 14:32:43.819030 master-0 kubenswrapper[7728]: I0223 14:32:43.818966 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" event={"ID":"8dd5fa7c-0519-4170-89c6-b369e5fc1990","Type":"ContainerStarted","Data":"37a983f29806128b58a67941f38167320dfc1a3122dd4612f255a73dd51901f6"} Feb 23 14:32:43.819030 master-0 kubenswrapper[7728]: 
I0223 14:32:43.819016 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" event={"ID":"8dd5fa7c-0519-4170-89c6-b369e5fc1990","Type":"ContainerStarted","Data":"0ed4966819a9ced864bc862dafc478844197956409ab608658cff2b481eaf91a"} Feb 23 14:32:43.819030 master-0 kubenswrapper[7728]: I0223 14:32:43.819027 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" event={"ID":"8dd5fa7c-0519-4170-89c6-b369e5fc1990","Type":"ContainerStarted","Data":"8a899f46b5ae367f29ecac877a3d8b6b2ea9e0cf04f3dc088df5a7ab7fffcc36"} Feb 23 14:32:43.836543 master-0 kubenswrapper[7728]: I0223 14:32:43.836452 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" podStartSLOduration=1.836433908 podStartE2EDuration="1.836433908s" podCreationTimestamp="2026-02-23 14:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:43.833101711 +0000 UTC m=+856.795763027" watchObservedRunningTime="2026-02-23 14:32:43.836433908 +0000 UTC m=+856.799095204" Feb 23 14:32:43.873002 master-0 kubenswrapper[7728]: I0223 14:32:43.872935 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"] Feb 23 14:32:43.874471 master-0 kubenswrapper[7728]: I0223 14:32:43.873186 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="multus-admission-controller" containerID="cri-o://a1c596e71b919718ef2a5ecca5e4edc213fe0205cd3b4ffeba1110c64e033918" gracePeriod=30 Feb 23 14:32:43.874471 master-0 kubenswrapper[7728]: I0223 14:32:43.873569 7728 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="kube-rbac-proxy" containerID="cri-o://2c192f5e695207c7fb2b849827d14cbf1e828f8b6127dbf574b9e2669fd9c4a7" gracePeriod=30 Feb 23 14:32:43.908571 master-0 kubenswrapper[7728]: I0223 14:32:43.908517 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:43.908571 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:43.908571 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:43.908571 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:43.908795 master-0 kubenswrapper[7728]: I0223 14:32:43.908579 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:44.220859 master-0 kubenswrapper[7728]: I0223 14:32:44.220806 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f" Feb 23 14:32:44.221384 master-0 kubenswrapper[7728]: E0223 14:32:44.221083 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e" Feb 23 14:32:44.833256 master-0 kubenswrapper[7728]: I0223 14:32:44.833190 7728 generic.go:334] "Generic (PLEG): container finished" 
podID="842d45c5-3452-4e97-b5f5-540395330a65" containerID="2c192f5e695207c7fb2b849827d14cbf1e828f8b6127dbf574b9e2669fd9c4a7" exitCode=0 Feb 23 14:32:44.833551 master-0 kubenswrapper[7728]: I0223 14:32:44.833273 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" event={"ID":"842d45c5-3452-4e97-b5f5-540395330a65","Type":"ContainerDied","Data":"2c192f5e695207c7fb2b849827d14cbf1e828f8b6127dbf574b9e2669fd9c4a7"} Feb 23 14:32:44.908661 master-0 kubenswrapper[7728]: I0223 14:32:44.908554 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:44.908661 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:44.908661 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:44.908661 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:44.909111 master-0 kubenswrapper[7728]: I0223 14:32:44.908693 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:45.092365 master-0 kubenswrapper[7728]: I0223 14:32:45.092226 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Feb 23 14:32:45.094442 master-0 kubenswrapper[7728]: I0223 14:32:45.094402 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.166592 master-0 kubenswrapper[7728]: I0223 14:32:45.166530 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Feb 23 14:32:45.254717 master-0 kubenswrapper[7728]: I0223 14:32:45.254647 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-var-lock\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.255352 master-0 kubenswrapper[7728]: I0223 14:32:45.254747 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e30288-572c-4c6f-a063-a30243db8fd8-kube-api-access\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.255352 master-0 kubenswrapper[7728]: I0223 14:32:45.254808 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.356383 master-0 kubenswrapper[7728]: I0223 14:32:45.356246 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.356620 master-0 kubenswrapper[7728]: I0223 14:32:45.356434 7728 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-var-lock\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.356620 master-0 kubenswrapper[7728]: I0223 14:32:45.356441 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.356620 master-0 kubenswrapper[7728]: I0223 14:32:45.356502 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e30288-572c-4c6f-a063-a30243db8fd8-kube-api-access\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.357089 master-0 kubenswrapper[7728]: I0223 14:32:45.356860 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-var-lock\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.375118 master-0 kubenswrapper[7728]: I0223 14:32:45.375090 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e30288-572c-4c6f-a063-a30243db8fd8-kube-api-access\") pod \"installer-2-master-0\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.424447 master-0 kubenswrapper[7728]: I0223 14:32:45.424373 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:32:45.828913 master-0 kubenswrapper[7728]: I0223 14:32:45.828843 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Feb 23 14:32:45.910666 master-0 kubenswrapper[7728]: I0223 14:32:45.910601 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:45.910666 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:45.910666 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:45.910666 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:45.910666 master-0 kubenswrapper[7728]: I0223 14:32:45.910650 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:46.849088 master-0 kubenswrapper[7728]: I0223 14:32:46.849037 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"94e30288-572c-4c6f-a063-a30243db8fd8","Type":"ContainerStarted","Data":"610ae43e00e8e1b0ff3dba88a6993fdf43f969aae5bdeeca94356519cf7c2602"} Feb 23 14:32:46.849088 master-0 kubenswrapper[7728]: I0223 14:32:46.849086 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"94e30288-572c-4c6f-a063-a30243db8fd8","Type":"ContainerStarted","Data":"afc8fc4dee0a8892459a65c0cd44c3d38a00ddab249a8e98f6954d9605e9c33a"} Feb 23 14:32:46.873335 master-0 kubenswrapper[7728]: I0223 14:32:46.873246 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.873225925 podStartE2EDuration="2.873225925s" podCreationTimestamp="2026-02-23 14:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:46.870376541 +0000 UTC m=+859.833037857" watchObservedRunningTime="2026-02-23 14:32:46.873225925 +0000 UTC m=+859.835887221" Feb 23 14:32:46.907381 master-0 kubenswrapper[7728]: I0223 14:32:46.907327 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:46.907381 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:46.907381 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:46.907381 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:46.907928 master-0 kubenswrapper[7728]: I0223 14:32:46.907890 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:47.910191 master-0 kubenswrapper[7728]: I0223 14:32:47.910073 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:47.910191 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:47.910191 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:47.910191 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:47.910191 master-0 kubenswrapper[7728]: I0223 14:32:47.910153 7728 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:48.906822 master-0 kubenswrapper[7728]: I0223 14:32:48.906745 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:48.906822 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:48.906822 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:48.906822 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:48.906822 master-0 kubenswrapper[7728]: I0223 14:32:48.906822 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:49.208328 master-0 kubenswrapper[7728]: I0223 14:32:49.208242 7728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Feb 23 14:32:49.210852 master-0 kubenswrapper[7728]: I0223 14:32:49.208507 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" containerID="cri-o://c04ab932ba2dd72c66746c81802bd4070c4841a3db596189904c0ef989dfa15b" gracePeriod=30 Feb 23 14:32:49.210852 master-0 kubenswrapper[7728]: I0223 14:32:49.208589 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" 
containerID="cri-o://412625d61293576a0c0a8ae370cd71d0f9c06b3bc76402a07899ef292404e66d" gracePeriod=30 Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.210921 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211202 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211219 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211236 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211242 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211252 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211258 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211268 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211274 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" 
containerName="cluster-policy-controller" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211285 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211290 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211297 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211303 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211313 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211319 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: E0223 14:32:49.211326 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.211354 master-0 kubenswrapper[7728]: I0223 14:32:49.211331 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211447 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 
14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211459 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211468 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211497 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211508 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211517 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211528 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: E0223 14:32:49.211625 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211633 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: E0223 14:32:49.211644 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.212143 master-0 
kubenswrapper[7728]: I0223 14:32:49.211650 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: E0223 14:32:49.211662 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211668 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211781 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211800 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.211812 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="cluster-policy-controller" Feb 23 14:32:49.212143 master-0 kubenswrapper[7728]: I0223 14:32:49.212039 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad9373c007a4fcd25e70622bdc8deb" containerName="kube-controller-manager" Feb 23 14:32:49.213243 master-0 kubenswrapper[7728]: I0223 14:32:49.212722 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.264041 master-0 kubenswrapper[7728]: I0223 14:32:49.258416 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Feb 23 14:32:49.318835 master-0 kubenswrapper[7728]: I0223 14:32:49.318774 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.319038 master-0 kubenswrapper[7728]: I0223 14:32:49.319013 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.421198 master-0 kubenswrapper[7728]: I0223 14:32:49.421083 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.421406 master-0 kubenswrapper[7728]: I0223 14:32:49.421242 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.421406 master-0 kubenswrapper[7728]: I0223 14:32:49.421242 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.421406 master-0 kubenswrapper[7728]: I0223 14:32:49.421363 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.554550 master-0 kubenswrapper[7728]: I0223 14:32:49.554178 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:49.573514 master-0 kubenswrapper[7728]: W0223 14:32:49.573459 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b63b0311108e042b4d16d40534ff93.slice/crio-fe39d10604843d18955327bfbe5d658cdd687bd280154adde8e869872e1753cf WatchSource:0}: Error finding container fe39d10604843d18955327bfbe5d658cdd687bd280154adde8e869872e1753cf: Status 404 returned error can't find the container with id fe39d10604843d18955327bfbe5d658cdd687bd280154adde8e869872e1753cf Feb 23 14:32:49.875237 master-0 kubenswrapper[7728]: I0223 14:32:49.875049 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"79b63b0311108e042b4d16d40534ff93","Type":"ContainerStarted","Data":"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"} Feb 23 14:32:49.875237 master-0 kubenswrapper[7728]: I0223 14:32:49.875121 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"79b63b0311108e042b4d16d40534ff93","Type":"ContainerStarted","Data":"fe39d10604843d18955327bfbe5d658cdd687bd280154adde8e869872e1753cf"} Feb 23 14:32:49.879733 master-0 kubenswrapper[7728]: I0223 14:32:49.879677 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="412625d61293576a0c0a8ae370cd71d0f9c06b3bc76402a07899ef292404e66d" exitCode=0 Feb 23 14:32:49.879733 master-0 kubenswrapper[7728]: I0223 14:32:49.879724 7728 generic.go:334] "Generic (PLEG): container finished" podID="c9ad9373c007a4fcd25e70622bdc8deb" containerID="c04ab932ba2dd72c66746c81802bd4070c4841a3db596189904c0ef989dfa15b" exitCode=0 Feb 23 14:32:49.879923 master-0 kubenswrapper[7728]: I0223 14:32:49.879813 7728 scope.go:117] "RemoveContainer" 
containerID="e7e20b5ba72ce778a4607a64cc8928522b6f4e4e91aae5a0ddbe4de3f2e8d4a6" Feb 23 14:32:49.885508 master-0 kubenswrapper[7728]: I0223 14:32:49.885460 7728 generic.go:334] "Generic (PLEG): container finished" podID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerID="9e428f62a82052df41d6797fdf53021748cfc643092e404d44bde9e1092162d6" exitCode=0 Feb 23 14:32:49.885584 master-0 kubenswrapper[7728]: I0223 14:32:49.885520 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"25b855e3-80dc-4ee5-80ab-c4742578a92f","Type":"ContainerDied","Data":"9e428f62a82052df41d6797fdf53021748cfc643092e404d44bde9e1092162d6"} Feb 23 14:32:49.909941 master-0 kubenswrapper[7728]: I0223 14:32:49.907776 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:49.909941 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:49.909941 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:49.909941 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:49.909941 master-0 kubenswrapper[7728]: I0223 14:32:49.907872 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:49.930748 master-0 kubenswrapper[7728]: I0223 14:32:49.928710 7728 scope.go:117] "RemoveContainer" containerID="16c3c1cec998b569c900e35bdf09e7492aae0331e38f36582875c9a5db092d13" Feb 23 14:32:50.116136 master-0 kubenswrapper[7728]: I0223 14:32:50.104886 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Feb 23 14:32:50.230471 master-0 kubenswrapper[7728]: I0223 14:32:50.230377 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") pod \"c9ad9373c007a4fcd25e70622bdc8deb\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " Feb 23 14:32:50.230471 master-0 kubenswrapper[7728]: I0223 14:32:50.230489 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") pod \"c9ad9373c007a4fcd25e70622bdc8deb\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230510 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") pod \"c9ad9373c007a4fcd25e70622bdc8deb\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230551 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") pod \"c9ad9373c007a4fcd25e70622bdc8deb\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230613 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") pod \"c9ad9373c007a4fcd25e70622bdc8deb\" (UID: \"c9ad9373c007a4fcd25e70622bdc8deb\") " Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230689 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets" (OuterVolumeSpecName: "secrets") pod "c9ad9373c007a4fcd25e70622bdc8deb" (UID: "c9ad9373c007a4fcd25e70622bdc8deb"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230740 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config" (OuterVolumeSpecName: "config") pod "c9ad9373c007a4fcd25e70622bdc8deb" (UID: "c9ad9373c007a4fcd25e70622bdc8deb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230750 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "c9ad9373c007a4fcd25e70622bdc8deb" (UID: "c9ad9373c007a4fcd25e70622bdc8deb"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230823 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs" (OuterVolumeSpecName: "logs") pod "c9ad9373c007a4fcd25e70622bdc8deb" (UID: "c9ad9373c007a4fcd25e70622bdc8deb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:32:50.231103 master-0 kubenswrapper[7728]: I0223 14:32:50.230811 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "c9ad9373c007a4fcd25e70622bdc8deb" (UID: "c9ad9373c007a4fcd25e70622bdc8deb"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:32:50.231342 master-0 kubenswrapper[7728]: I0223 14:32:50.231197 7728 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-secrets\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:50.231342 master-0 kubenswrapper[7728]: I0223 14:32:50.231211 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:50.231342 master-0 kubenswrapper[7728]: I0223 14:32:50.231221 7728 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:50.231342 master-0 kubenswrapper[7728]: I0223 14:32:50.231233 7728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-logs\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:50.231342 master-0 kubenswrapper[7728]: I0223 14:32:50.231241 7728 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/c9ad9373c007a4fcd25e70622bdc8deb-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:50.897316 master-0 kubenswrapper[7728]: I0223 14:32:50.897168 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"79b63b0311108e042b4d16d40534ff93","Type":"ContainerStarted","Data":"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"}
Feb 23 14:32:50.897316 master-0 kubenswrapper[7728]: I0223 14:32:50.897229 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"79b63b0311108e042b4d16d40534ff93","Type":"ContainerStarted","Data":"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"}
Feb 23 14:32:50.897316 master-0 kubenswrapper[7728]: I0223 14:32:50.897243 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"79b63b0311108e042b4d16d40534ff93","Type":"ContainerStarted","Data":"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"}
Feb 23 14:32:50.900110 master-0 kubenswrapper[7728]: I0223 14:32:50.900066 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Feb 23 14:32:50.900598 master-0 kubenswrapper[7728]: I0223 14:32:50.900539 7728 scope.go:117] "RemoveContainer" containerID="412625d61293576a0c0a8ae370cd71d0f9c06b3bc76402a07899ef292404e66d"
Feb 23 14:32:50.914571 master-0 kubenswrapper[7728]: I0223 14:32:50.910687 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:50.914571 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:50.914571 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:50.914571 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:50.914571 master-0 kubenswrapper[7728]: I0223 14:32:50.910769 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:50.938958 master-0 kubenswrapper[7728]: I0223 14:32:50.935602 7728 scope.go:117] "RemoveContainer" containerID="c04ab932ba2dd72c66746c81802bd4070c4841a3db596189904c0ef989dfa15b"
Feb 23 14:32:50.940090 master-0 kubenswrapper[7728]: I0223 14:32:50.940014 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.93998726 podStartE2EDuration="1.93998726s" podCreationTimestamp="2026-02-23 14:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:50.936868879 +0000 UTC m=+863.899530195" watchObservedRunningTime="2026-02-23 14:32:50.93998726 +0000 UTC m=+863.902648556"
Feb 23 14:32:51.217056 master-0 kubenswrapper[7728]: E0223 14:32:51.216995 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 23 14:32:51.218662 master-0 kubenswrapper[7728]: E0223 14:32:51.218629 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 23 14:32:51.219794 master-0 kubenswrapper[7728]: E0223 14:32:51.219745 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"]
Feb 23 14:32:51.219855 master-0 kubenswrapper[7728]: E0223 14:32:51.219819 7728 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerName="kube-multus-additional-cni-plugins"
Feb 23 14:32:51.232274 master-0 kubenswrapper[7728]: I0223 14:32:51.231780 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ad9373c007a4fcd25e70622bdc8deb" path="/var/lib/kubelet/pods/c9ad9373c007a4fcd25e70622bdc8deb/volumes"
Feb 23 14:32:51.232274 master-0 kubenswrapper[7728]: I0223 14:32:51.232219 7728 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID=""
Feb 23 14:32:51.253530 master-0 kubenswrapper[7728]: I0223 14:32:51.253458 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Feb 23 14:32:51.253530 master-0 kubenswrapper[7728]: I0223 14:32:51.253526 7728 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="42a06015-5766-435d-af4f-46c1fd8e3466"
Feb 23 14:32:51.256644 master-0 kubenswrapper[7728]: I0223 14:32:51.256567 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Feb 23 14:32:51.256644 master-0 kubenswrapper[7728]: I0223 14:32:51.256608 7728 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="42a06015-5766-435d-af4f-46c1fd8e3466"
Feb 23 14:32:51.289186 master-0 kubenswrapper[7728]: I0223 14:32:51.289128 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Feb 23 14:32:51.455782 master-0 kubenswrapper[7728]: I0223 14:32:51.455694 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-var-lock\") pod \"25b855e3-80dc-4ee5-80ab-c4742578a92f\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") "
Feb 23 14:32:51.455985 master-0 kubenswrapper[7728]: I0223 14:32:51.455815 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-var-lock" (OuterVolumeSpecName: "var-lock") pod "25b855e3-80dc-4ee5-80ab-c4742578a92f" (UID: "25b855e3-80dc-4ee5-80ab-c4742578a92f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:32:51.455985 master-0 kubenswrapper[7728]: I0223 14:32:51.455836 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-kubelet-dir\") pod \"25b855e3-80dc-4ee5-80ab-c4742578a92f\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") "
Feb 23 14:32:51.455985 master-0 kubenswrapper[7728]: I0223 14:32:51.455865 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "25b855e3-80dc-4ee5-80ab-c4742578a92f" (UID: "25b855e3-80dc-4ee5-80ab-c4742578a92f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:32:51.455985 master-0 kubenswrapper[7728]: I0223 14:32:51.455884 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25b855e3-80dc-4ee5-80ab-c4742578a92f-kube-api-access\") pod \"25b855e3-80dc-4ee5-80ab-c4742578a92f\" (UID: \"25b855e3-80dc-4ee5-80ab-c4742578a92f\") "
Feb 23 14:32:51.456157 master-0 kubenswrapper[7728]: I0223 14:32:51.456088 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:51.456157 master-0 kubenswrapper[7728]: I0223 14:32:51.456101 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25b855e3-80dc-4ee5-80ab-c4742578a92f-var-lock\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:51.459166 master-0 kubenswrapper[7728]: I0223 14:32:51.459100 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b855e3-80dc-4ee5-80ab-c4742578a92f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "25b855e3-80dc-4ee5-80ab-c4742578a92f" (UID: "25b855e3-80dc-4ee5-80ab-c4742578a92f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:32:51.557714 master-0 kubenswrapper[7728]: I0223 14:32:51.557639 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25b855e3-80dc-4ee5-80ab-c4742578a92f-kube-api-access\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:51.909392 master-0 kubenswrapper[7728]: I0223 14:32:51.909264 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:51.909392 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:51.909392 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:51.909392 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:51.909392 master-0 kubenswrapper[7728]: I0223 14:32:51.909346 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:51.910943 master-0 kubenswrapper[7728]: I0223 14:32:51.910909 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"25b855e3-80dc-4ee5-80ab-c4742578a92f","Type":"ContainerDied","Data":"38c349954c9e4d48bcd2d1d0bfed1c2e92410197933aa893d6cec912dd9abe84"}
Feb 23 14:32:51.910943 master-0 kubenswrapper[7728]: I0223 14:32:51.910926 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Feb 23 14:32:51.910943 master-0 kubenswrapper[7728]: I0223 14:32:51.910940 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38c349954c9e4d48bcd2d1d0bfed1c2e92410197933aa893d6cec912dd9abe84"
Feb 23 14:32:52.907998 master-0 kubenswrapper[7728]: I0223 14:32:52.907915 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:52.907998 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:52.907998 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:52.907998 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:52.907998 master-0 kubenswrapper[7728]: I0223 14:32:52.907994 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:53.516946 master-0 kubenswrapper[7728]: I0223 14:32:53.516890 7728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Feb 23 14:32:53.517157 master-0 kubenswrapper[7728]: I0223 14:32:53.517114 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="56c3cb71c9851003c8de7e7c5db4b87e" containerName="kube-scheduler" containerID="cri-o://901941f5b39d593d08535a59f0a3320fa3d1d31c538434d8bc740dd1aca5de85" gracePeriod=30
Feb 23 14:32:53.518136 master-0 kubenswrapper[7728]: I0223 14:32:53.517834 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Feb 23 14:32:53.518136 master-0 kubenswrapper[7728]: E0223 14:32:53.518088 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerName="installer"
Feb 23 14:32:53.518136 master-0 kubenswrapper[7728]: I0223 14:32:53.518103 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerName="installer"
Feb 23 14:32:53.518136 master-0 kubenswrapper[7728]: E0223 14:32:53.518120 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c3cb71c9851003c8de7e7c5db4b87e" containerName="kube-scheduler"
Feb 23 14:32:53.518136 master-0 kubenswrapper[7728]: I0223 14:32:53.518127 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c3cb71c9851003c8de7e7c5db4b87e" containerName="kube-scheduler"
Feb 23 14:32:53.518313 master-0 kubenswrapper[7728]: E0223 14:32:53.518154 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c3cb71c9851003c8de7e7c5db4b87e" containerName="kube-scheduler"
Feb 23 14:32:53.518313 master-0 kubenswrapper[7728]: I0223 14:32:53.518162 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c3cb71c9851003c8de7e7c5db4b87e" containerName="kube-scheduler"
Feb 23 14:32:53.518313 master-0 kubenswrapper[7728]: I0223 14:32:53.518304 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c3cb71c9851003c8de7e7c5db4b87e" containerName="kube-scheduler"
Feb 23 14:32:53.518405 master-0 kubenswrapper[7728]: I0223 14:32:53.518326 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerName="installer"
Feb 23 14:32:53.518637 master-0 kubenswrapper[7728]: I0223 14:32:53.518610 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c3cb71c9851003c8de7e7c5db4b87e" containerName="kube-scheduler"
Feb 23 14:32:53.519641 master-0 kubenswrapper[7728]: I0223 14:32:53.519613 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.553868 master-0 kubenswrapper[7728]: I0223 14:32:53.553700 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Feb 23 14:32:53.585541 master-0 kubenswrapper[7728]: I0223 14:32:53.585371 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.585541 master-0 kubenswrapper[7728]: I0223 14:32:53.585418 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.668173 master-0 kubenswrapper[7728]: I0223 14:32:53.668130 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:32:53.686293 master-0 kubenswrapper[7728]: I0223 14:32:53.686235 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.686293 master-0 kubenswrapper[7728]: I0223 14:32:53.686289 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.686586 master-0 kubenswrapper[7728]: I0223 14:32:53.686380 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.686586 master-0 kubenswrapper[7728]: I0223 14:32:53.686412 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.687141 master-0 kubenswrapper[7728]: I0223 14:32:53.687099 7728 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="23535d00-9f55-4d93-a373-c374c9306afb"
Feb 23 14:32:53.787748 master-0 kubenswrapper[7728]: I0223 14:32:53.787605 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") pod \"56c3cb71c9851003c8de7e7c5db4b87e\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") "
Feb 23 14:32:53.787748 master-0 kubenswrapper[7728]: I0223 14:32:53.787657 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") pod \"56c3cb71c9851003c8de7e7c5db4b87e\" (UID: \"56c3cb71c9851003c8de7e7c5db4b87e\") "
Feb 23 14:32:53.788003 master-0 kubenswrapper[7728]: I0223 14:32:53.787913 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs" (OuterVolumeSpecName: "logs") pod "56c3cb71c9851003c8de7e7c5db4b87e" (UID: "56c3cb71c9851003c8de7e7c5db4b87e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:32:53.788003 master-0 kubenswrapper[7728]: I0223 14:32:53.787922 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets" (OuterVolumeSpecName: "secrets") pod "56c3cb71c9851003c8de7e7c5db4b87e" (UID: "56c3cb71c9851003c8de7e7c5db4b87e"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:32:53.853721 master-0 kubenswrapper[7728]: I0223 14:32:53.853650 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:53.886352 master-0 kubenswrapper[7728]: W0223 14:32:53.886252 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56ff46cdb00d28519af7c0cdc9ea8d11.slice/crio-edb6ec0bf3017407524bcef4889f83a1c2c54ff05b0e40918d7eda5368e06757 WatchSource:0}: Error finding container edb6ec0bf3017407524bcef4889f83a1c2c54ff05b0e40918d7eda5368e06757: Status 404 returned error can't find the container with id edb6ec0bf3017407524bcef4889f83a1c2c54ff05b0e40918d7eda5368e06757
Feb 23 14:32:53.888811 master-0 kubenswrapper[7728]: I0223 14:32:53.888676 7728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-logs\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:53.888811 master-0 kubenswrapper[7728]: I0223 14:32:53.888735 7728 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/56c3cb71c9851003c8de7e7c5db4b87e-secrets\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:53.907241 master-0 kubenswrapper[7728]: I0223 14:32:53.907164 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:53.907241 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:53.907241 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:53.907241 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:53.907529 master-0 kubenswrapper[7728]: I0223 14:32:53.907262 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:53.934661 master-0 kubenswrapper[7728]: I0223 14:32:53.934602 7728 generic.go:334] "Generic (PLEG): container finished" podID="15245f43-22db-42eb-ab0b-702240986437" containerID="d8fda6fec7eadedba1d4400e4d7e27798506234350c769e6451d1eaf5b0ede8d" exitCode=0
Feb 23 14:32:53.935133 master-0 kubenswrapper[7728]: I0223 14:32:53.934728 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"15245f43-22db-42eb-ab0b-702240986437","Type":"ContainerDied","Data":"d8fda6fec7eadedba1d4400e4d7e27798506234350c769e6451d1eaf5b0ede8d"}
Feb 23 14:32:53.936287 master-0 kubenswrapper[7728]: I0223 14:32:53.936236 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"56ff46cdb00d28519af7c0cdc9ea8d11","Type":"ContainerStarted","Data":"edb6ec0bf3017407524bcef4889f83a1c2c54ff05b0e40918d7eda5368e06757"}
Feb 23 14:32:53.939904 master-0 kubenswrapper[7728]: I0223 14:32:53.939866 7728 generic.go:334] "Generic (PLEG): container finished" podID="56c3cb71c9851003c8de7e7c5db4b87e" containerID="901941f5b39d593d08535a59f0a3320fa3d1d31c538434d8bc740dd1aca5de85" exitCode=0
Feb 23 14:32:53.939978 master-0 kubenswrapper[7728]: I0223 14:32:53.939908 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b4c5917f42a018e656736ffe0ec509b45d342d70ccb1039a1f41866022cf32e"
Feb 23 14:32:53.939978 master-0 kubenswrapper[7728]: I0223 14:32:53.939913 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Feb 23 14:32:53.939978 master-0 kubenswrapper[7728]: I0223 14:32:53.939928 7728 scope.go:117] "RemoveContainer" containerID="1161af5c0919fc04c557fffb0fa1799b448226d91a3bed741eb027099a2bf8f9"
Feb 23 14:32:54.907927 master-0 kubenswrapper[7728]: I0223 14:32:54.907792 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:54.907927 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:54.907927 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:54.907927 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:54.907927 master-0 kubenswrapper[7728]: I0223 14:32:54.907879 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:54.950504 master-0 kubenswrapper[7728]: I0223 14:32:54.950418 7728 generic.go:334] "Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="7a566b5e0634944e8d5422e837762f49e59d372d80be5def3116f1b2efb53f3a" exitCode=0
Feb 23 14:32:54.951313 master-0 kubenswrapper[7728]: I0223 14:32:54.951275 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"56ff46cdb00d28519af7c0cdc9ea8d11","Type":"ContainerDied","Data":"7a566b5e0634944e8d5422e837762f49e59d372d80be5def3116f1b2efb53f3a"}
Feb 23 14:32:55.232748 master-0 kubenswrapper[7728]: I0223 14:32:55.232709 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c3cb71c9851003c8de7e7c5db4b87e" path="/var/lib/kubelet/pods/56c3cb71c9851003c8de7e7c5db4b87e/volumes"
Feb 23 14:32:55.232979 master-0 kubenswrapper[7728]: I0223 14:32:55.232958 7728 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID=""
Feb 23 14:32:55.250806 master-0 kubenswrapper[7728]: I0223 14:32:55.250731 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Feb 23 14:32:55.250806 master-0 kubenswrapper[7728]: I0223 14:32:55.250781 7728 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="23535d00-9f55-4d93-a373-c374c9306afb"
Feb 23 14:32:55.254424 master-0 kubenswrapper[7728]: I0223 14:32:55.254353 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Feb 23 14:32:55.254424 master-0 kubenswrapper[7728]: I0223 14:32:55.254414 7728 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="23535d00-9f55-4d93-a373-c374c9306afb"
Feb 23 14:32:55.266851 master-0 kubenswrapper[7728]: I0223 14:32:55.266803 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Feb 23 14:32:55.406661 master-0 kubenswrapper[7728]: I0223 14:32:55.406610 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-kubelet-dir\") pod \"15245f43-22db-42eb-ab0b-702240986437\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") "
Feb 23 14:32:55.406835 master-0 kubenswrapper[7728]: I0223 14:32:55.406682 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15245f43-22db-42eb-ab0b-702240986437-kube-api-access\") pod \"15245f43-22db-42eb-ab0b-702240986437\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") "
Feb 23 14:32:55.406835 master-0 kubenswrapper[7728]: I0223 14:32:55.406780 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-var-lock\") pod \"15245f43-22db-42eb-ab0b-702240986437\" (UID: \"15245f43-22db-42eb-ab0b-702240986437\") "
Feb 23 14:32:55.407248 master-0 kubenswrapper[7728]: I0223 14:32:55.407115 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-var-lock" (OuterVolumeSpecName: "var-lock") pod "15245f43-22db-42eb-ab0b-702240986437" (UID: "15245f43-22db-42eb-ab0b-702240986437"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:32:55.407248 master-0 kubenswrapper[7728]: I0223 14:32:55.407155 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "15245f43-22db-42eb-ab0b-702240986437" (UID: "15245f43-22db-42eb-ab0b-702240986437"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:32:55.410438 master-0 kubenswrapper[7728]: I0223 14:32:55.410396 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15245f43-22db-42eb-ab0b-702240986437-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "15245f43-22db-42eb-ab0b-702240986437" (UID: "15245f43-22db-42eb-ab0b-702240986437"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:32:55.508886 master-0 kubenswrapper[7728]: I0223 14:32:55.508822 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-var-lock\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:55.508886 master-0 kubenswrapper[7728]: I0223 14:32:55.508856 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15245f43-22db-42eb-ab0b-702240986437-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:55.508886 master-0 kubenswrapper[7728]: I0223 14:32:55.508866 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/15245f43-22db-42eb-ab0b-702240986437-kube-api-access\") on node \"master-0\" DevicePath \"\""
Feb 23 14:32:55.907219 master-0 kubenswrapper[7728]: I0223 14:32:55.907063 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:55.907219 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:55.907219 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:55.907219 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:55.907219 master-0 kubenswrapper[7728]: I0223 14:32:55.907151 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:55.967149 master-0 kubenswrapper[7728]: I0223 14:32:55.967068 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"56ff46cdb00d28519af7c0cdc9ea8d11","Type":"ContainerStarted","Data":"1bc827a8854dfec15010881aca2028b9a63aeaba5a66ba610581f32b2d5f3a53"}
Feb 23 14:32:55.967149 master-0 kubenswrapper[7728]: I0223 14:32:55.967134 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"56ff46cdb00d28519af7c0cdc9ea8d11","Type":"ContainerStarted","Data":"a83359f382bbf7e84f344fddfee0b01fcac40fd46b179877c37b9571576884e8"}
Feb 23 14:32:55.967149 master-0 kubenswrapper[7728]: I0223 14:32:55.967146 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"56ff46cdb00d28519af7c0cdc9ea8d11","Type":"ContainerStarted","Data":"a607c62f2f6fcfd8c1e82eea4a4f2c6c0363686e9d645511b15629d774c518ef"}
Feb 23 14:32:55.967800 master-0 kubenswrapper[7728]: I0223 14:32:55.967185 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:32:55.969547 master-0 kubenswrapper[7728]: I0223 14:32:55.969247 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"15245f43-22db-42eb-ab0b-702240986437","Type":"ContainerDied","Data":"926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42"}
Feb 23 14:32:55.969547 master-0 kubenswrapper[7728]: I0223 14:32:55.969287 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42"
Feb 23 14:32:55.969547 master-0 kubenswrapper[7728]: I0223 14:32:55.969322 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Feb 23 14:32:56.009145 master-0 kubenswrapper[7728]: I0223 14:32:56.009041 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=3.009023595 podStartE2EDuration="3.009023595s" podCreationTimestamp="2026-02-23 14:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:32:56.000877153 +0000 UTC m=+868.963538499" watchObservedRunningTime="2026-02-23 14:32:56.009023595 +0000 UTC m=+868.971684891"
Feb 23 14:32:56.221050 master-0 kubenswrapper[7728]: I0223 14:32:56.220993 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f"
Feb 23 14:32:56.221273 master-0 kubenswrapper[7728]: E0223 14:32:56.221220 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e"
Feb 23 14:32:56.906823 master-0 kubenswrapper[7728]: I0223 14:32:56.906725 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:56.906823 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:56.906823 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:56.906823 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:56.907215 master-0 kubenswrapper[7728]: I0223 14:32:56.906847 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:57.907438 master-0 kubenswrapper[7728]: I0223 14:32:57.907380 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:57.907438 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:57.907438 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:57.907438 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:57.908000 master-0 kubenswrapper[7728]: I0223 14:32:57.907466 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:58.909004 master-0 kubenswrapper[7728]: I0223 14:32:58.908938 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:32:58.909004 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:32:58.909004 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:32:58.909004 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:32:58.909582 master-0 kubenswrapper[7728]: I0223 14:32:58.909016 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:32:59.554850 master-0 kubenswrapper[7728]: I0223 14:32:59.554796 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:32:59.554850 master-0 kubenswrapper[7728]: I0223 14:32:59.554861 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:32:59.555105 master-0 kubenswrapper[7728]: I0223 14:32:59.554875 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:32:59.555105 master-0 kubenswrapper[7728]: I0223 14:32:59.554885 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:32:59.556132 master-0 kubenswrapper[7728]: I0223 14:32:59.556091 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Feb 23 14:32:59.556261 master-0 kubenswrapper[7728]: I0223 14:32:59.556231 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" containerName="installer" containerID="cri-o://610ae43e00e8e1b0ff3dba88a6993fdf43f969aae5bdeeca94356519cf7c2602" gracePeriod=30
Feb 23 14:32:59.559995 master-0 kubenswrapper[7728]: I0223 14:32:59.559954 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:32:59.561470 master-0 kubenswrapper[7728]: I0223
14:32:59.560968 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:32:59.907904 master-0 kubenswrapper[7728]: I0223 14:32:59.907726 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:32:59.907904 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:32:59.907904 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:32:59.907904 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:32:59.907904 master-0 kubenswrapper[7728]: I0223 14:32:59.907797 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:32:59.999356 master-0 kubenswrapper[7728]: I0223 14:32:59.999256 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_94e30288-572c-4c6f-a063-a30243db8fd8/installer/0.log" Feb 23 14:32:59.999356 master-0 kubenswrapper[7728]: I0223 14:32:59.999308 7728 generic.go:334] "Generic (PLEG): container finished" podID="94e30288-572c-4c6f-a063-a30243db8fd8" containerID="610ae43e00e8e1b0ff3dba88a6993fdf43f969aae5bdeeca94356519cf7c2602" exitCode=1 Feb 23 14:33:00.001147 master-0 kubenswrapper[7728]: I0223 14:32:59.999966 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"94e30288-572c-4c6f-a063-a30243db8fd8","Type":"ContainerDied","Data":"610ae43e00e8e1b0ff3dba88a6993fdf43f969aae5bdeeca94356519cf7c2602"} Feb 23 14:33:00.001147 master-0 kubenswrapper[7728]: I0223 14:33:00.000196 7728 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"94e30288-572c-4c6f-a063-a30243db8fd8","Type":"ContainerDied","Data":"afc8fc4dee0a8892459a65c0cd44c3d38a00ddab249a8e98f6954d9605e9c33a"} Feb 23 14:33:00.001147 master-0 kubenswrapper[7728]: I0223 14:33:00.000217 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc8fc4dee0a8892459a65c0cd44c3d38a00ddab249a8e98f6954d9605e9c33a" Feb 23 14:33:00.005781 master-0 kubenswrapper[7728]: I0223 14:33:00.005712 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:33:00.006173 master-0 kubenswrapper[7728]: I0223 14:33:00.006134 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:33:00.009185 master-0 kubenswrapper[7728]: I0223 14:33:00.009156 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_94e30288-572c-4c6f-a063-a30243db8fd8/installer/0.log" Feb 23 14:33:00.009419 master-0 kubenswrapper[7728]: I0223 14:33:00.009395 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:33:00.074316 master-0 kubenswrapper[7728]: I0223 14:33:00.074236 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e30288-572c-4c6f-a063-a30243db8fd8-kube-api-access\") pod \"94e30288-572c-4c6f-a063-a30243db8fd8\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " Feb 23 14:33:00.074555 master-0 kubenswrapper[7728]: I0223 14:33:00.074454 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-kubelet-dir\") pod \"94e30288-572c-4c6f-a063-a30243db8fd8\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " Feb 23 14:33:00.074555 master-0 kubenswrapper[7728]: I0223 14:33:00.074520 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "94e30288-572c-4c6f-a063-a30243db8fd8" (UID: "94e30288-572c-4c6f-a063-a30243db8fd8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:00.074555 master-0 kubenswrapper[7728]: I0223 14:33:00.074548 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-var-lock\") pod \"94e30288-572c-4c6f-a063-a30243db8fd8\" (UID: \"94e30288-572c-4c6f-a063-a30243db8fd8\") " Feb 23 14:33:00.074705 master-0 kubenswrapper[7728]: I0223 14:33:00.074640 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-var-lock" (OuterVolumeSpecName: "var-lock") pod "94e30288-572c-4c6f-a063-a30243db8fd8" (UID: "94e30288-572c-4c6f-a063-a30243db8fd8"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:00.074901 master-0 kubenswrapper[7728]: I0223 14:33:00.074869 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:00.074901 master-0 kubenswrapper[7728]: I0223 14:33:00.074888 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/94e30288-572c-4c6f-a063-a30243db8fd8-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:00.077112 master-0 kubenswrapper[7728]: I0223 14:33:00.077069 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e30288-572c-4c6f-a063-a30243db8fd8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "94e30288-572c-4c6f-a063-a30243db8fd8" (UID: "94e30288-572c-4c6f-a063-a30243db8fd8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:33:00.175723 master-0 kubenswrapper[7728]: I0223 14:33:00.175620 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e30288-572c-4c6f-a063-a30243db8fd8-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:00.907958 master-0 kubenswrapper[7728]: I0223 14:33:00.907868 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:00.907958 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:00.907958 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:00.907958 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:00.908301 master-0 kubenswrapper[7728]: I0223 14:33:00.907990 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:01.008316 master-0 kubenswrapper[7728]: I0223 14:33:01.008242 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Feb 23 14:33:01.061812 master-0 kubenswrapper[7728]: I0223 14:33:01.061050 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Feb 23 14:33:01.078587 master-0 kubenswrapper[7728]: I0223 14:33:01.078471 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Feb 23 14:33:01.216424 master-0 kubenswrapper[7728]: E0223 14:33:01.216347 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 14:33:01.218180 master-0 kubenswrapper[7728]: E0223 14:33:01.218119 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 14:33:01.220092 master-0 kubenswrapper[7728]: E0223 14:33:01.219981 7728 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" cmd=["/bin/bash","-c","test -f /ready/ready"] Feb 23 14:33:01.220189 master-0 kubenswrapper[7728]: E0223 14:33:01.220113 7728 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" 
podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerName="kube-multus-additional-cni-plugins" Feb 23 14:33:01.227662 master-0 kubenswrapper[7728]: I0223 14:33:01.227611 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" path="/var/lib/kubelet/pods/94e30288-572c-4c6f-a063-a30243db8fd8/volumes" Feb 23 14:33:01.907545 master-0 kubenswrapper[7728]: I0223 14:33:01.907449 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:01.907545 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:01.907545 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:01.907545 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:01.907545 master-0 kubenswrapper[7728]: I0223 14:33:01.907551 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:02.907520 master-0 kubenswrapper[7728]: I0223 14:33:02.907440 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:02.907520 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:02.907520 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:02.907520 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:02.908538 master-0 kubenswrapper[7728]: I0223 14:33:02.907566 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" 
podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:02.979778 master-0 kubenswrapper[7728]: I0223 14:33:02.979614 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:33:02.987258 master-0 kubenswrapper[7728]: I0223 14:33:02.987210 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:33:03.778008 master-0 kubenswrapper[7728]: I0223 14:33:03.777952 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Feb 23 14:33:03.778643 master-0 kubenswrapper[7728]: E0223 14:33:03.778618 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" containerName="installer" Feb 23 14:33:03.778767 master-0 kubenswrapper[7728]: I0223 14:33:03.778749 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" containerName="installer" Feb 23 14:33:03.778874 master-0 kubenswrapper[7728]: E0223 14:33:03.778858 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15245f43-22db-42eb-ab0b-702240986437" containerName="installer" Feb 23 14:33:03.778981 master-0 kubenswrapper[7728]: I0223 14:33:03.778966 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="15245f43-22db-42eb-ab0b-702240986437" containerName="installer" Feb 23 14:33:03.779287 master-0 kubenswrapper[7728]: I0223 14:33:03.779268 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" containerName="installer" Feb 23 14:33:03.779413 master-0 kubenswrapper[7728]: I0223 14:33:03.779396 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="15245f43-22db-42eb-ab0b-702240986437" containerName="installer" Feb 23 14:33:03.780163 
master-0 kubenswrapper[7728]: I0223 14:33:03.780140 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:03.811468 master-0 kubenswrapper[7728]: I0223 14:33:03.811396 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Feb 23 14:33:03.912169 master-0 kubenswrapper[7728]: I0223 14:33:03.911700 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:03.912169 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:03.912169 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:03.912169 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:03.912169 master-0 kubenswrapper[7728]: I0223 14:33:03.911766 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:03.926812 master-0 kubenswrapper[7728]: I0223 14:33:03.926761 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:03.926812 master-0 kubenswrapper[7728]: I0223 14:33:03.926811 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-var-lock\") pod \"installer-3-master-0\" (UID: 
\"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:03.927037 master-0 kubenswrapper[7728]: I0223 14:33:03.926832 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a81da820-e31a-455d-b87f-95322ee57d3a-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.027797 master-0 kubenswrapper[7728]: I0223 14:33:04.027752 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.028169 master-0 kubenswrapper[7728]: I0223 14:33:04.027905 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.028169 master-0 kubenswrapper[7728]: I0223 14:33:04.028090 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-var-lock\") pod \"installer-3-master-0\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.028405 master-0 kubenswrapper[7728]: I0223 14:33:04.028375 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-var-lock\") pod \"installer-3-master-0\" (UID: 
\"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.028568 master-0 kubenswrapper[7728]: I0223 14:33:04.028431 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a81da820-e31a-455d-b87f-95322ee57d3a-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.046589 master-0 kubenswrapper[7728]: I0223 14:33:04.046515 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a81da820-e31a-455d-b87f-95322ee57d3a-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.142135 master-0 kubenswrapper[7728]: I0223 14:33:04.141987 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:04.551687 master-0 kubenswrapper[7728]: I0223 14:33:04.551622 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Feb 23 14:33:04.560925 master-0 kubenswrapper[7728]: W0223 14:33:04.560876 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda81da820_e31a_455d_b87f_95322ee57d3a.slice/crio-3061aec629b93c613efcc696d7dd1406550961b477efc6823a883d8b2ccdd9eb WatchSource:0}: Error finding container 3061aec629b93c613efcc696d7dd1406550961b477efc6823a883d8b2ccdd9eb: Status 404 returned error can't find the container with id 3061aec629b93c613efcc696d7dd1406550961b477efc6823a883d8b2ccdd9eb Feb 23 14:33:04.907363 master-0 kubenswrapper[7728]: I0223 14:33:04.907325 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:04.907363 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:04.907363 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:04.907363 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:04.907725 master-0 kubenswrapper[7728]: I0223 14:33:04.907687 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:05.045735 master-0 kubenswrapper[7728]: I0223 14:33:05.045287 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a81da820-e31a-455d-b87f-95322ee57d3a","Type":"ContainerStarted","Data":"4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f"} 
Feb 23 14:33:05.046795 master-0 kubenswrapper[7728]: I0223 14:33:05.046744 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a81da820-e31a-455d-b87f-95322ee57d3a","Type":"ContainerStarted","Data":"3061aec629b93c613efcc696d7dd1406550961b477efc6823a883d8b2ccdd9eb"} Feb 23 14:33:05.069722 master-0 kubenswrapper[7728]: I0223 14:33:05.069609 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.06957848 podStartE2EDuration="2.06957848s" podCreationTimestamp="2026-02-23 14:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:33:05.063816191 +0000 UTC m=+878.026477527" watchObservedRunningTime="2026-02-23 14:33:05.06957848 +0000 UTC m=+878.032239816" Feb 23 14:33:05.914653 master-0 kubenswrapper[7728]: I0223 14:33:05.909522 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:05.914653 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:05.914653 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:05.914653 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:05.914653 master-0 kubenswrapper[7728]: I0223 14:33:05.909603 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:06.908114 master-0 kubenswrapper[7728]: I0223 14:33:06.908022 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:06.908114 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:06.908114 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:06.908114 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:06.908744 master-0 kubenswrapper[7728]: I0223 14:33:06.908158 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:07.010237 master-0 kubenswrapper[7728]: I0223 14:33:07.010179 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-njjl7_56b2b6a4-9230-46e4-9689-b267bc753b6f/kube-multus-additional-cni-plugins/0.log" Feb 23 14:33:07.010454 master-0 kubenswrapper[7728]: I0223 14:33:07.010293 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" Feb 23 14:33:07.066985 master-0 kubenswrapper[7728]: I0223 14:33:07.066909 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-njjl7_56b2b6a4-9230-46e4-9689-b267bc753b6f/kube-multus-additional-cni-plugins/0.log" Feb 23 14:33:07.066985 master-0 kubenswrapper[7728]: I0223 14:33:07.066969 7728 generic.go:334] "Generic (PLEG): container finished" podID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2" exitCode=137 Feb 23 14:33:07.067276 master-0 kubenswrapper[7728]: I0223 14:33:07.067052 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7"
Feb 23 14:33:07.067276 master-0 kubenswrapper[7728]: I0223 14:33:07.067125 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" event={"ID":"56b2b6a4-9230-46e4-9689-b267bc753b6f","Type":"ContainerDied","Data":"b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2"}
Feb 23 14:33:07.067276 master-0 kubenswrapper[7728]: I0223 14:33:07.067252 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-njjl7" event={"ID":"56b2b6a4-9230-46e4-9689-b267bc753b6f","Type":"ContainerDied","Data":"310238c5b2504d8af703571fde117480b00b3b9b8f61f02ad61a3558948f39ca"}
Feb 23 14:33:07.067467 master-0 kubenswrapper[7728]: I0223 14:33:07.067306 7728 scope.go:117] "RemoveContainer" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2"
Feb 23 14:33:07.074432 master-0 kubenswrapper[7728]: I0223 14:33:07.074391 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgkrc\" (UniqueName: \"kubernetes.io/projected/56b2b6a4-9230-46e4-9689-b267bc753b6f-kube-api-access-kgkrc\") pod \"56b2b6a4-9230-46e4-9689-b267bc753b6f\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") "
Feb 23 14:33:07.081297 master-0 kubenswrapper[7728]: I0223 14:33:07.081239 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b2b6a4-9230-46e4-9689-b267bc753b6f-kube-api-access-kgkrc" (OuterVolumeSpecName: "kube-api-access-kgkrc") pod "56b2b6a4-9230-46e4-9689-b267bc753b6f" (UID: "56b2b6a4-9230-46e4-9689-b267bc753b6f"). InnerVolumeSpecName "kube-api-access-kgkrc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:33:07.088907 master-0 kubenswrapper[7728]: I0223 14:33:07.088877 7728 scope.go:117] "RemoveContainer" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2"
Feb 23 14:33:07.089467 master-0 kubenswrapper[7728]: E0223 14:33:07.089430 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2\": container with ID starting with b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2 not found: ID does not exist" containerID="b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2"
Feb 23 14:33:07.089557 master-0 kubenswrapper[7728]: I0223 14:33:07.089488 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2"} err="failed to get container status \"b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2\": rpc error: code = NotFound desc = could not find container \"b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2\": container with ID starting with b4056ada42457871cd18017ebc6a92f379eb1f932bf638f2bfe123f8631970c2 not found: ID does not exist"
Feb 23 14:33:07.175935 master-0 kubenswrapper[7728]: I0223 14:33:07.175805 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56b2b6a4-9230-46e4-9689-b267bc753b6f-ready\") pod \"56b2b6a4-9230-46e4-9689-b267bc753b6f\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") "
Feb 23 14:33:07.176263 master-0 kubenswrapper[7728]: I0223 14:33:07.176237 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56b2b6a4-9230-46e4-9689-b267bc753b6f-cni-sysctl-allowlist\") pod \"56b2b6a4-9230-46e4-9689-b267bc753b6f\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") "
Feb 23 14:33:07.176811 master-0 kubenswrapper[7728]: I0223 14:33:07.176176 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56b2b6a4-9230-46e4-9689-b267bc753b6f-ready" (OuterVolumeSpecName: "ready") pod "56b2b6a4-9230-46e4-9689-b267bc753b6f" (UID: "56b2b6a4-9230-46e4-9689-b267bc753b6f"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:33:07.176811 master-0 kubenswrapper[7728]: I0223 14:33:07.176638 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56b2b6a4-9230-46e4-9689-b267bc753b6f-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "56b2b6a4-9230-46e4-9689-b267bc753b6f" (UID: "56b2b6a4-9230-46e4-9689-b267bc753b6f"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:33:07.176898 master-0 kubenswrapper[7728]: I0223 14:33:07.176818 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56b2b6a4-9230-46e4-9689-b267bc753b6f-tuning-conf-dir\") pod \"56b2b6a4-9230-46e4-9689-b267bc753b6f\" (UID: \"56b2b6a4-9230-46e4-9689-b267bc753b6f\") "
Feb 23 14:33:07.176937 master-0 kubenswrapper[7728]: I0223 14:33:07.176923 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56b2b6a4-9230-46e4-9689-b267bc753b6f-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "56b2b6a4-9230-46e4-9689-b267bc753b6f" (UID: "56b2b6a4-9230-46e4-9689-b267bc753b6f"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:33:07.177362 master-0 kubenswrapper[7728]: I0223 14:33:07.177334 7728 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/56b2b6a4-9230-46e4-9689-b267bc753b6f-tuning-conf-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:07.177362 master-0 kubenswrapper[7728]: I0223 14:33:07.177361 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgkrc\" (UniqueName: \"kubernetes.io/projected/56b2b6a4-9230-46e4-9689-b267bc753b6f-kube-api-access-kgkrc\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:07.177464 master-0 kubenswrapper[7728]: I0223 14:33:07.177372 7728 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/56b2b6a4-9230-46e4-9689-b267bc753b6f-ready\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:07.177464 master-0 kubenswrapper[7728]: I0223 14:33:07.177382 7728 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/56b2b6a4-9230-46e4-9689-b267bc753b6f-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:07.397572 master-0 kubenswrapper[7728]: I0223 14:33:07.397249 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-njjl7"]
Feb 23 14:33:07.400385 master-0 kubenswrapper[7728]: I0223 14:33:07.400332 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-njjl7"]
Feb 23 14:33:07.731560 master-0 kubenswrapper[7728]: I0223 14:33:07.730979 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_5caa659d-854d-43d6-8169-e03a6b1e97f1/installer/0.log"
Feb 23 14:33:07.731758 master-0 kubenswrapper[7728]: I0223 14:33:07.731679 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Feb 23 14:33:07.885547 master-0 kubenswrapper[7728]: I0223 14:33:07.885419 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-kubelet-dir\") pod \"5caa659d-854d-43d6-8169-e03a6b1e97f1\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") "
Feb 23 14:33:07.885781 master-0 kubenswrapper[7728]: I0223 14:33:07.885561 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5caa659d-854d-43d6-8169-e03a6b1e97f1" (UID: "5caa659d-854d-43d6-8169-e03a6b1e97f1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:33:07.885781 master-0 kubenswrapper[7728]: I0223 14:33:07.885651 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-var-lock\") pod \"5caa659d-854d-43d6-8169-e03a6b1e97f1\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") "
Feb 23 14:33:07.885781 master-0 kubenswrapper[7728]: I0223 14:33:07.885733 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-var-lock" (OuterVolumeSpecName: "var-lock") pod "5caa659d-854d-43d6-8169-e03a6b1e97f1" (UID: "5caa659d-854d-43d6-8169-e03a6b1e97f1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:33:07.885781 master-0 kubenswrapper[7728]: I0223 14:33:07.885767 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5caa659d-854d-43d6-8169-e03a6b1e97f1-kube-api-access\") pod \"5caa659d-854d-43d6-8169-e03a6b1e97f1\" (UID: \"5caa659d-854d-43d6-8169-e03a6b1e97f1\") "
Feb 23 14:33:07.886224 master-0 kubenswrapper[7728]: I0223 14:33:07.886189 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:07.886268 master-0 kubenswrapper[7728]: I0223 14:33:07.886225 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5caa659d-854d-43d6-8169-e03a6b1e97f1-var-lock\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:07.888426 master-0 kubenswrapper[7728]: I0223 14:33:07.888340 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5caa659d-854d-43d6-8169-e03a6b1e97f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5caa659d-854d-43d6-8169-e03a6b1e97f1" (UID: "5caa659d-854d-43d6-8169-e03a6b1e97f1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:33:07.908463 master-0 kubenswrapper[7728]: I0223 14:33:07.908320 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:07.908463 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:07.908463 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:07.908463 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:07.908463 master-0 kubenswrapper[7728]: I0223 14:33:07.908399 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:07.987385 master-0 kubenswrapper[7728]: I0223 14:33:07.987306 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5caa659d-854d-43d6-8169-e03a6b1e97f1-kube-api-access\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:08.091436 master-0 kubenswrapper[7728]: I0223 14:33:08.091383 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-retry-1-master-0_5caa659d-854d-43d6-8169-e03a6b1e97f1/installer/0.log"
Feb 23 14:33:08.091436 master-0 kubenswrapper[7728]: I0223 14:33:08.091433 7728 generic.go:334] "Generic (PLEG): container finished" podID="5caa659d-854d-43d6-8169-e03a6b1e97f1" containerID="df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054" exitCode=1
Feb 23 14:33:08.092133 master-0 kubenswrapper[7728]: I0223 14:33:08.091495 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"5caa659d-854d-43d6-8169-e03a6b1e97f1","Type":"ContainerDied","Data":"df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054"}
Feb 23 14:33:08.092133 master-0 kubenswrapper[7728]: I0223 14:33:08.091521 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"5caa659d-854d-43d6-8169-e03a6b1e97f1","Type":"ContainerDied","Data":"a1b72bc7fa94d2067752196074cb58d6cb50c64ab06eb17aa033c2353eecf711"}
Feb 23 14:33:08.092133 master-0 kubenswrapper[7728]: I0223 14:33:08.091551 7728 scope.go:117] "RemoveContainer" containerID="df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054"
Feb 23 14:33:08.092133 master-0 kubenswrapper[7728]: I0223 14:33:08.091610 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Feb 23 14:33:08.110586 master-0 kubenswrapper[7728]: I0223 14:33:08.110467 7728 scope.go:117] "RemoveContainer" containerID="df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054"
Feb 23 14:33:08.111052 master-0 kubenswrapper[7728]: E0223 14:33:08.110978 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054\": container with ID starting with df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054 not found: ID does not exist" containerID="df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054"
Feb 23 14:33:08.111052 master-0 kubenswrapper[7728]: I0223 14:33:08.111039 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054"} err="failed to get container status \"df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054\": rpc error: code = NotFound desc = could not find container \"df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054\": container with ID starting with df4fa1ae04da7af9b174cd0914daf1b675cdba93f2bb2264ca25aa4373df6054 not found: ID does not exist"
Feb 23 14:33:08.135800 master-0 kubenswrapper[7728]: I0223 14:33:08.135308 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Feb 23 14:33:08.147991 master-0 kubenswrapper[7728]: I0223 14:33:08.147924 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Feb 23 14:33:08.907626 master-0 kubenswrapper[7728]: I0223 14:33:08.907536 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:08.907626 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:08.907626 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:08.907626 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:08.908180 master-0 kubenswrapper[7728]: I0223 14:33:08.907619 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:09.235383 master-0 kubenswrapper[7728]: I0223 14:33:09.235304 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" path="/var/lib/kubelet/pods/56b2b6a4-9230-46e4-9689-b267bc753b6f/volumes"
Feb 23 14:33:09.236242 master-0 kubenswrapper[7728]: I0223 14:33:09.236220 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5caa659d-854d-43d6-8169-e03a6b1e97f1" path="/var/lib/kubelet/pods/5caa659d-854d-43d6-8169-e03a6b1e97f1/volumes"
Feb 23 14:33:09.907011 master-0 kubenswrapper[7728]: I0223 14:33:09.906902 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:09.907011 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:09.907011 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:09.907011 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:09.907011 master-0 kubenswrapper[7728]: I0223 14:33:09.906997 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:10.907221 master-0 kubenswrapper[7728]: I0223 14:33:10.907172 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:10.907221 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:10.907221 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:10.907221 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:10.907935 master-0 kubenswrapper[7728]: I0223 14:33:10.907907 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:11.220725 master-0 kubenswrapper[7728]: I0223 14:33:11.220677 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f"
Feb 23 14:33:11.220924 master-0 kubenswrapper[7728]: E0223 14:33:11.220893 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e"
Feb 23 14:33:11.907026 master-0 kubenswrapper[7728]: I0223 14:33:11.906966 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:11.907026 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:11.907026 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:11.907026 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:11.907870 master-0 kubenswrapper[7728]: I0223 14:33:11.907034 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:12.907883 master-0 kubenswrapper[7728]: I0223 14:33:12.907808 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:12.907883 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:12.907883 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:12.907883 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:12.908546 master-0 kubenswrapper[7728]: I0223 14:33:12.907887 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:13.920347 master-0 kubenswrapper[7728]: I0223 14:33:13.920284 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:13.920347 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:13.920347 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:13.920347 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:13.921047 master-0 kubenswrapper[7728]: I0223 14:33:13.920365 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:14.160231 master-0 kubenswrapper[7728]: I0223 14:33:14.160108 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5f98f4f8d5-fnc9v_842d45c5-3452-4e97-b5f5-540395330a65/multus-admission-controller/0.log"
Feb 23 14:33:14.160231 master-0 kubenswrapper[7728]: I0223 14:33:14.160177 7728 generic.go:334] "Generic (PLEG): container finished" podID="842d45c5-3452-4e97-b5f5-540395330a65" containerID="a1c596e71b919718ef2a5ecca5e4edc213fe0205cd3b4ffeba1110c64e033918" exitCode=137
Feb 23 14:33:14.160231 master-0 kubenswrapper[7728]: I0223 14:33:14.160219 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" event={"ID":"842d45c5-3452-4e97-b5f5-540395330a65","Type":"ContainerDied","Data":"a1c596e71b919718ef2a5ecca5e4edc213fe0205cd3b4ffeba1110c64e033918"}
Feb 23 14:33:14.616258 master-0 kubenswrapper[7728]: I0223 14:33:14.616176 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5f98f4f8d5-fnc9v_842d45c5-3452-4e97-b5f5-540395330a65/multus-admission-controller/0.log"
Feb 23 14:33:14.616527 master-0 kubenswrapper[7728]: I0223 14:33:14.616300 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:33:14.791812 master-0 kubenswrapper[7728]: I0223 14:33:14.791742 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j54j5\" (UniqueName: \"kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5\") pod \"842d45c5-3452-4e97-b5f5-540395330a65\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") "
Feb 23 14:33:14.792141 master-0 kubenswrapper[7728]: I0223 14:33:14.791869 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") pod \"842d45c5-3452-4e97-b5f5-540395330a65\" (UID: \"842d45c5-3452-4e97-b5f5-540395330a65\") "
Feb 23 14:33:14.794559 master-0 kubenswrapper[7728]: I0223 14:33:14.794436 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5" (OuterVolumeSpecName: "kube-api-access-j54j5") pod "842d45c5-3452-4e97-b5f5-540395330a65" (UID: "842d45c5-3452-4e97-b5f5-540395330a65"). InnerVolumeSpecName "kube-api-access-j54j5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:33:14.794722 master-0 kubenswrapper[7728]: I0223 14:33:14.794682 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "842d45c5-3452-4e97-b5f5-540395330a65" (UID: "842d45c5-3452-4e97-b5f5-540395330a65"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:33:14.893957 master-0 kubenswrapper[7728]: I0223 14:33:14.893876 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j54j5\" (UniqueName: \"kubernetes.io/projected/842d45c5-3452-4e97-b5f5-540395330a65-kube-api-access-j54j5\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:14.893957 master-0 kubenswrapper[7728]: I0223 14:33:14.893944 7728 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/842d45c5-3452-4e97-b5f5-540395330a65-webhook-certs\") on node \"master-0\" DevicePath \"\""
Feb 23 14:33:14.907488 master-0 kubenswrapper[7728]: I0223 14:33:14.907401 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:14.907488 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:14.907488 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:14.907488 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:14.907739 master-0 kubenswrapper[7728]: I0223 14:33:14.907539 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:15.170416 master-0 kubenswrapper[7728]: I0223 14:33:15.170215 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-5f98f4f8d5-fnc9v_842d45c5-3452-4e97-b5f5-540395330a65/multus-admission-controller/0.log"
Feb 23 14:33:15.170416 master-0 kubenswrapper[7728]: I0223 14:33:15.170280 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v" event={"ID":"842d45c5-3452-4e97-b5f5-540395330a65","Type":"ContainerDied","Data":"fa13ad269ac35b8dfe7ce217763b060906b7b9f87b5b13a745caacb634cb9d80"}
Feb 23 14:33:15.170416 master-0 kubenswrapper[7728]: I0223 14:33:15.170322 7728 scope.go:117] "RemoveContainer" containerID="2c192f5e695207c7fb2b849827d14cbf1e828f8b6127dbf574b9e2669fd9c4a7"
Feb 23 14:33:15.170416 master-0 kubenswrapper[7728]: I0223 14:33:15.170341 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"
Feb 23 14:33:15.189867 master-0 kubenswrapper[7728]: I0223 14:33:15.189820 7728 scope.go:117] "RemoveContainer" containerID="a1c596e71b919718ef2a5ecca5e4edc213fe0205cd3b4ffeba1110c64e033918"
Feb 23 14:33:15.212025 master-0 kubenswrapper[7728]: I0223 14:33:15.211442 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"]
Feb 23 14:33:15.216127 master-0 kubenswrapper[7728]: I0223 14:33:15.216079 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-5f98f4f8d5-fnc9v"]
Feb 23 14:33:15.231721 master-0 kubenswrapper[7728]: I0223 14:33:15.231681 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842d45c5-3452-4e97-b5f5-540395330a65" path="/var/lib/kubelet/pods/842d45c5-3452-4e97-b5f5-540395330a65/volumes"
Feb 23 14:33:15.910693 master-0 kubenswrapper[7728]: I0223 14:33:15.910612 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:15.910693 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:15.910693 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:15.910693 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:15.911157 master-0 kubenswrapper[7728]: I0223 14:33:15.910723 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:16.907633 master-0 kubenswrapper[7728]: I0223 14:33:16.907541 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:16.907633 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:16.907633 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:16.907633 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:16.908162 master-0 kubenswrapper[7728]: I0223 14:33:16.907661 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:17.896452 master-0 kubenswrapper[7728]: I0223 14:33:17.896374 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Feb 23 14:33:17.896862 master-0 kubenswrapper[7728]: E0223 14:33:17.896817 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerName="kube-multus-additional-cni-plugins"
Feb 23 14:33:17.896862 master-0 kubenswrapper[7728]: I0223 14:33:17.896849 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerName="kube-multus-additional-cni-plugins"
Feb 23 14:33:17.897040 master-0 kubenswrapper[7728]: E0223 14:33:17.896878 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="kube-rbac-proxy"
Feb 23 14:33:17.897040 master-0 kubenswrapper[7728]: I0223 14:33:17.896893 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="kube-rbac-proxy"
Feb 23 14:33:17.897040 master-0 kubenswrapper[7728]: E0223 14:33:17.896918 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5caa659d-854d-43d6-8169-e03a6b1e97f1" containerName="installer"
Feb 23 14:33:17.897040 master-0 kubenswrapper[7728]: I0223 14:33:17.896933 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="5caa659d-854d-43d6-8169-e03a6b1e97f1" containerName="installer"
Feb 23 14:33:17.897040 master-0 kubenswrapper[7728]: E0223 14:33:17.896963 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="multus-admission-controller"
Feb 23 14:33:17.897040 master-0 kubenswrapper[7728]: I0223 14:33:17.896977 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="multus-admission-controller"
Feb 23 14:33:17.897518 master-0 kubenswrapper[7728]: I0223 14:33:17.897154 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="multus-admission-controller"
Feb 23 14:33:17.897518 master-0 kubenswrapper[7728]: I0223 14:33:17.897191 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b2b6a4-9230-46e4-9689-b267bc753b6f" containerName="kube-multus-additional-cni-plugins"
Feb 23 14:33:17.897518 master-0 kubenswrapper[7728]: I0223 14:33:17.897219 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="5caa659d-854d-43d6-8169-e03a6b1e97f1" containerName="installer"
Feb 23 14:33:17.897518 master-0 kubenswrapper[7728]: I0223 14:33:17.897242 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="842d45c5-3452-4e97-b5f5-540395330a65" containerName="kube-rbac-proxy"
Feb 23 14:33:17.898107 master-0 kubenswrapper[7728]: I0223 14:33:17.898052 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:17.899868 master-0 kubenswrapper[7728]: I0223 14:33:17.899804 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Feb 23 14:33:17.900084 master-0 kubenswrapper[7728]: I0223 14:33:17.900041 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-bhcxh"
Feb 23 14:33:17.907314 master-0 kubenswrapper[7728]: I0223 14:33:17.907238 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:17.907314 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:17.907314 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:17.907314 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:17.907314 master-0 kubenswrapper[7728]: I0223 14:33:17.907300 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:17.917443 master-0 kubenswrapper[7728]: I0223 14:33:17.917182 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Feb 23 14:33:17.938692 master-0 kubenswrapper[7728]: I0223 14:33:17.938600 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1225c7e0-f2d1-4b39-979c-c77191862c81-kube-api-access\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:17.938692 master-0 kubenswrapper[7728]: I0223 14:33:17.938678 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-var-lock\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:17.939081 master-0 kubenswrapper[7728]: I0223 14:33:17.938773 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.040130 master-0 kubenswrapper[7728]: I0223 14:33:18.039996 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-var-lock\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.040298 master-0 kubenswrapper[7728]: I0223 14:33:18.040144 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-var-lock\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.040298 master-0 kubenswrapper[7728]: I0223 14:33:18.040249 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.040368 master-0 kubenswrapper[7728]: I0223 14:33:18.040356 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1225c7e0-f2d1-4b39-979c-c77191862c81-kube-api-access\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.040454 master-0 kubenswrapper[7728]: I0223 14:33:18.040401 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.058993 master-0 kubenswrapper[7728]: I0223 14:33:18.058935 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1225c7e0-f2d1-4b39-979c-c77191862c81-kube-api-access\") pod \"installer-5-master-0\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.238080 master-0 kubenswrapper[7728]: I0223 14:33:18.238017 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Feb 23 14:33:18.730091 master-0 kubenswrapper[7728]: I0223 14:33:18.730046 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Feb 23 14:33:18.732352 master-0 kubenswrapper[7728]: W0223 14:33:18.732304 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1225c7e0_f2d1_4b39_979c_c77191862c81.slice/crio-c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff WatchSource:0}: Error finding container c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff: Status 404 returned error can't find the container with id c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff
Feb 23 14:33:18.907399 master-0 kubenswrapper[7728]: I0223 14:33:18.907346 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:18.907399 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:18.907399 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:18.907399 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:18.907853 master-0 kubenswrapper[7728]: I0223 14:33:18.907418 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:19.199466 master-0 kubenswrapper[7728]: I0223 14:33:19.199401 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"1225c7e0-f2d1-4b39-979c-c77191862c81","Type":"ContainerStarted","Data":"77b1792020215e1792b7b140e0ac936225d54418ad659b82a3189d8865905a56"}
Feb 23 14:33:19.199466 master-0 kubenswrapper[7728]: I0223 14:33:19.199459 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"1225c7e0-f2d1-4b39-979c-c77191862c81","Type":"ContainerStarted","Data":"c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff"}
Feb 23 14:33:19.281462 master-0 kubenswrapper[7728]: I0223 14:33:19.281249 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=2.281231682 podStartE2EDuration="2.281231682s" podCreationTimestamp="2026-02-23 14:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:33:19.278712907 +0000 UTC m=+892.241374233" watchObservedRunningTime="2026-02-23 14:33:19.281231682 +0000 UTC m=+892.243892978"
Feb 23 14:33:19.907389 master-0 kubenswrapper[7728]: I0223 14:33:19.907283 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:19.907389 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:19.907389 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:19.907389 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:19.908192 master-0 kubenswrapper[7728]: I0223 14:33:19.907415 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:20.915286 master-0 kubenswrapper[7728]: I0223 14:33:20.915221 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:20.915286 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:20.915286 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:20.915286 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:20.915947 master-0 kubenswrapper[7728]: I0223 14:33:20.915295 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:21.907682 master-0 kubenswrapper[7728]: I0223 14:33:21.907591 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:21.907682 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:21.907682 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:21.907682 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:21.907682 master-0 kubenswrapper[7728]: I0223 14:33:21.907679 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:22.221187 master-0 kubenswrapper[7728]: I0223 14:33:22.221069 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f"
Feb 23 14:33:22.222018 master-0 kubenswrapper[7728]: E0223 14:33:22.221449 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e"
Feb 23 14:33:22.907884 master-0 kubenswrapper[7728]: I0223 14:33:22.907835 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:22.907884 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:22.907884 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:22.907884 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:22.908328 master-0 kubenswrapper[7728]: I0223 14:33:22.908295 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:23.907205 master-0 kubenswrapper[7728]: I0223 14:33:23.907145 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:23.907205 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:23.907205 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:23.907205 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:23.907913 master-0 kubenswrapper[7728]: I0223 14:33:23.907233 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:24.908235 master-0 kubenswrapper[7728]: I0223 14:33:24.908171 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:24.908235 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:24.908235 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:24.908235 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:24.908918 master-0 kubenswrapper[7728]: I0223 14:33:24.908274 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:25.907822 master-0 kubenswrapper[7728]: I0223 14:33:25.907753 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:25.907822 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:25.907822 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:25.907822 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:25.908862 master-0 kubenswrapper[7728]: I0223 14:33:25.907882 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:26.347966 master-0 kubenswrapper[7728]: I0223 14:33:26.347916 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Feb 23 14:33:26.349137 master-0 kubenswrapper[7728]: I0223 14:33:26.349117 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.351532 master-0 kubenswrapper[7728]: I0223 14:33:26.351515 7728 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-4hjzs"
Feb 23 14:33:26.351697 master-0 kubenswrapper[7728]: I0223 14:33:26.351641 7728 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 23 14:33:26.423439 master-0 kubenswrapper[7728]: I0223 14:33:26.423397 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Feb 23 14:33:26.479159 master-0 kubenswrapper[7728]: I0223 14:33:26.479123 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-var-lock\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.479412 master-0 kubenswrapper[7728]: I0223 14:33:26.479397 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.479516 master-0 kubenswrapper[7728]: I0223 14:33:26.479503 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.581170 master-0 kubenswrapper[7728]: I0223 14:33:26.581085 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-var-lock\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.581170 master-0 kubenswrapper[7728]: I0223 14:33:26.581164 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.581469 master-0 kubenswrapper[7728]: I0223 14:33:26.581194 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.581668 master-0 kubenswrapper[7728]: I0223 14:33:26.581640 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.581815 master-0 kubenswrapper[7728]: I0223 14:33:26.581777 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-var-lock\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.598237 master-0 kubenswrapper[7728]: I0223 14:33:26.598112 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.681255 master-0 kubenswrapper[7728]: I0223 14:33:26.681179 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:33:26.907823 master-0 kubenswrapper[7728]: I0223 14:33:26.907700 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:26.907823 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:26.907823 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:26.907823 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:26.907823 master-0 kubenswrapper[7728]: I0223 14:33:26.907803 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:27.155961 master-0 kubenswrapper[7728]: I0223 14:33:27.155687 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Feb 23 14:33:27.158914 master-0 kubenswrapper[7728]: W0223 14:33:27.158764 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda52cbaf6_c1af_4c29_aef9_67523f5148c6.slice/crio-a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914 WatchSource:0}: Error finding container a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914: Status 404 returned error can't find the container with id a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914
Feb 23 14:33:27.269938 master-0 kubenswrapper[7728]: I0223 14:33:27.266668 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"a52cbaf6-c1af-4c29-aef9-67523f5148c6","Type":"ContainerStarted","Data":"a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914"}
Feb 23 14:33:27.736659 master-0 kubenswrapper[7728]: I0223 14:33:27.736584 7728 scope.go:117] "RemoveContainer" containerID="901941f5b39d593d08535a59f0a3320fa3d1d31c538434d8bc740dd1aca5de85"
Feb 23 14:33:27.907239 master-0 kubenswrapper[7728]: I0223 14:33:27.907128 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:27.907239 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:27.907239 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:27.907239 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:27.908147 master-0 kubenswrapper[7728]: I0223 14:33:27.907242 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:28.273003 master-0 kubenswrapper[7728]: I0223 14:33:28.272937 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"a52cbaf6-c1af-4c29-aef9-67523f5148c6","Type":"ContainerStarted","Data":"f2ac2c56c1e34a13c986eed31e989cbdb13313c0a91d44d3e68704b2399e5a39"}
Feb 23 14:33:28.304511 master-0 kubenswrapper[7728]: I0223 14:33:28.301397 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.301366706 podStartE2EDuration="2.301366706s" podCreationTimestamp="2026-02-23 14:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:33:28.294085027 +0000 UTC m=+901.256746323" watchObservedRunningTime="2026-02-23 14:33:28.301366706 +0000 UTC m=+901.264028002"
Feb 23 14:33:28.908873 master-0 kubenswrapper[7728]: I0223 14:33:28.908763 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:28.908873 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:28.908873 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:28.908873 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:28.909367 master-0 kubenswrapper[7728]: I0223 14:33:28.908907 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:29.907800 master-0 kubenswrapper[7728]: I0223 14:33:29.907733 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:29.907800 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:29.907800 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:29.907800 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:29.908330 master-0 kubenswrapper[7728]: I0223 14:33:29.907808 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:30.907517 master-0 kubenswrapper[7728]: I0223 14:33:30.907436 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:30.907517 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:30.907517 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:30.907517 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:30.907965 master-0 kubenswrapper[7728]: I0223 14:33:30.907632 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:31.909816 master-0 kubenswrapper[7728]: I0223 14:33:31.909737 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:31.909816 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:31.909816 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:31.909816 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:31.910596 master-0 kubenswrapper[7728]: I0223 14:33:31.909830 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:32.367736 master-0 kubenswrapper[7728]: I0223 14:33:32.367646 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Feb 23 14:33:32.368006 master-0 kubenswrapper[7728]: I0223 14:33:32.367956 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-3-master-0" podUID="a81da820-e31a-455d-b87f-95322ee57d3a" containerName="installer" containerID="cri-o://4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f" gracePeriod=30
Feb 23 14:33:32.908351 master-0 kubenswrapper[7728]: I0223 14:33:32.908264 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:32.908351 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:32.908351 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:32.908351 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:32.908670 master-0 kubenswrapper[7728]: I0223 14:33:32.908399 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:33.908127 master-0 kubenswrapper[7728]: I0223 14:33:33.908044 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:33.908127 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:33.908127 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:33.908127 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:33.908920 master-0 kubenswrapper[7728]: I0223 14:33:33.908132 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:34.221500 master-0 kubenswrapper[7728]: I0223 14:33:34.221420 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f"
Feb 23 14:33:34.221827 master-0 kubenswrapper[7728]: E0223 14:33:34.221780 7728 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-6569778c84-hsl6c_openshift-ingress-operator(3488a7eb-5170-478c-9af7-490dbe0f514e)\"" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" podUID="3488a7eb-5170-478c-9af7-490dbe0f514e"
Feb 23 14:33:34.907295 master-0 kubenswrapper[7728]: I0223 14:33:34.907240 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:34.907295 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:34.907295 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:34.907295 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:34.907701 master-0 kubenswrapper[7728]: I0223 14:33:34.907307 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:35.609637 master-0 kubenswrapper[7728]: I0223 14:33:35.609580 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Feb 23 14:33:35.611402 master-0 kubenswrapper[7728]: I0223 14:33:35.611339 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.621698 master-0 kubenswrapper[7728]: I0223 14:33:35.621651 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.621798 master-0 kubenswrapper[7728]: I0223 14:33:35.621766 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.621993 master-0 kubenswrapper[7728]: I0223 14:33:35.621949 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.646906 master-0 kubenswrapper[7728]: I0223 14:33:35.644658 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Feb 23 14:33:35.725009 master-0 kubenswrapper[7728]: I0223 14:33:35.724269 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.725463 master-0 kubenswrapper[7728]: I0223 14:33:35.725440 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.725715 master-0 kubenswrapper[7728]: I0223 14:33:35.725689 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.726342 master-0 kubenswrapper[7728]: I0223 14:33:35.726272 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.726446 master-0 kubenswrapper[7728]: I0223 14:33:35.726301 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.757056 master-0 kubenswrapper[7728]: I0223 14:33:35.757002 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:35.907437 master-0 kubenswrapper[7728]: I0223 14:33:35.907281 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:35.907437 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:35.907437 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:35.907437 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:35.907437 master-0 kubenswrapper[7728]: I0223 14:33:35.907333 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:35.949401 master-0 kubenswrapper[7728]: I0223 14:33:35.949294 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:33:36.373234 master-0 kubenswrapper[7728]: I0223 14:33:36.373179 7728 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Feb 23 14:33:36.383772 master-0 kubenswrapper[7728]: W0223 14:33:36.383651 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode1148263_7b15_4c12_a217_8b030ecd9348.slice/crio-329af3342eefe9c97b6c099fc381b37d3806777c128f648a5ea19e45c1f91e62 WatchSource:0}: Error finding container 329af3342eefe9c97b6c099fc381b37d3806777c128f648a5ea19e45c1f91e62: Status 404 returned error can't find the container with id 329af3342eefe9c97b6c099fc381b37d3806777c128f648a5ea19e45c1f91e62
Feb 23 14:33:36.909030 master-0 kubenswrapper[7728]: I0223 14:33:36.908926 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:33:36.909030 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:33:36.909030 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:33:36.909030 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:33:36.909030 master-0 kubenswrapper[7728]: I0223 14:33:36.908999 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:33:37.339881 master-0 kubenswrapper[7728]: I0223 14:33:37.339785 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"e1148263-7b15-4c12-a217-8b030ecd9348","Type":"ContainerStarted","Data":"909663cdb0c0ac8db46b5e0989f1e87cc68ef03f3124e36ed314cba8e6058032"}
Feb 23 14:33:37.339881 master-0 kubenswrapper[7728]: I0223 14:33:37.339870 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"e1148263-7b15-4c12-a217-8b030ecd9348","Type":"ContainerStarted","Data":"329af3342eefe9c97b6c099fc381b37d3806777c128f648a5ea19e45c1f91e62"} Feb 23 14:33:37.357834 master-0 kubenswrapper[7728]: I0223 14:33:37.357754 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=2.357732031 podStartE2EDuration="2.357732031s" podCreationTimestamp="2026-02-23 14:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:33:37.357053353 +0000 UTC m=+910.319714669" watchObservedRunningTime="2026-02-23 14:33:37.357732031 +0000 UTC m=+910.320393347" Feb 23 14:33:37.909151 master-0 kubenswrapper[7728]: I0223 14:33:37.909056 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:37.909151 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:37.909151 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:37.909151 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:37.910197 master-0 kubenswrapper[7728]: I0223 14:33:37.909170 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:38.910687 master-0 kubenswrapper[7728]: I0223 14:33:38.910619 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:38.910687 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:38.910687 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:38.910687 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:38.911850 master-0 kubenswrapper[7728]: I0223 14:33:38.910719 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:39.910468 master-0 kubenswrapper[7728]: I0223 14:33:39.910379 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:39.910468 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:39.910468 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:39.910468 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:39.911051 master-0 kubenswrapper[7728]: I0223 14:33:39.910516 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:40.908810 master-0 kubenswrapper[7728]: I0223 14:33:40.908671 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:40.908810 master-0 kubenswrapper[7728]: 
[-]has-synced failed: reason withheld Feb 23 14:33:40.908810 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:40.908810 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:40.908810 master-0 kubenswrapper[7728]: I0223 14:33:40.908778 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:41.908302 master-0 kubenswrapper[7728]: I0223 14:33:41.908193 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:41.908302 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:41.908302 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:41.908302 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:41.908875 master-0 kubenswrapper[7728]: I0223 14:33:41.908338 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:42.907752 master-0 kubenswrapper[7728]: I0223 14:33:42.907672 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:42.907752 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:42.907752 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:42.907752 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:42.907752 master-0 
kubenswrapper[7728]: I0223 14:33:42.907737 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:43.863674 master-0 kubenswrapper[7728]: I0223 14:33:43.863570 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:43.907583 master-0 kubenswrapper[7728]: I0223 14:33:43.907516 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:43.907583 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:43.907583 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:43.907583 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:43.907862 master-0 kubenswrapper[7728]: I0223 14:33:43.907592 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:44.910218 master-0 kubenswrapper[7728]: I0223 14:33:44.910141 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:44.910218 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:44.910218 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:44.910218 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:44.910921 master-0 
kubenswrapper[7728]: I0223 14:33:44.910259 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:45.909848 master-0 kubenswrapper[7728]: I0223 14:33:45.909800 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:45.909848 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:45.909848 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:45.909848 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:45.910113 master-0 kubenswrapper[7728]: I0223 14:33:45.909878 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:46.162536 master-0 kubenswrapper[7728]: I0223 14:33:46.162461 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_a81da820-e31a-455d-b87f-95322ee57d3a/installer/0.log" Feb 23 14:33:46.163003 master-0 kubenswrapper[7728]: I0223 14:33:46.162563 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:46.286163 master-0 kubenswrapper[7728]: I0223 14:33:46.286100 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-var-lock\") pod \"a81da820-e31a-455d-b87f-95322ee57d3a\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " Feb 23 14:33:46.286399 master-0 kubenswrapper[7728]: I0223 14:33:46.286201 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a81da820-e31a-455d-b87f-95322ee57d3a-kube-api-access\") pod \"a81da820-e31a-455d-b87f-95322ee57d3a\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " Feb 23 14:33:46.286399 master-0 kubenswrapper[7728]: I0223 14:33:46.286208 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-var-lock" (OuterVolumeSpecName: "var-lock") pod "a81da820-e31a-455d-b87f-95322ee57d3a" (UID: "a81da820-e31a-455d-b87f-95322ee57d3a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:46.286399 master-0 kubenswrapper[7728]: I0223 14:33:46.286224 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-kubelet-dir\") pod \"a81da820-e31a-455d-b87f-95322ee57d3a\" (UID: \"a81da820-e31a-455d-b87f-95322ee57d3a\") " Feb 23 14:33:46.286399 master-0 kubenswrapper[7728]: I0223 14:33:46.286271 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a81da820-e31a-455d-b87f-95322ee57d3a" (UID: "a81da820-e31a-455d-b87f-95322ee57d3a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:46.286774 master-0 kubenswrapper[7728]: I0223 14:33:46.286731 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:46.286774 master-0 kubenswrapper[7728]: I0223 14:33:46.286769 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a81da820-e31a-455d-b87f-95322ee57d3a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:46.288823 master-0 kubenswrapper[7728]: I0223 14:33:46.288770 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81da820-e31a-455d-b87f-95322ee57d3a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a81da820-e31a-455d-b87f-95322ee57d3a" (UID: "a81da820-e31a-455d-b87f-95322ee57d3a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:33:46.388246 master-0 kubenswrapper[7728]: I0223 14:33:46.388073 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a81da820-e31a-455d-b87f-95322ee57d3a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:46.413784 master-0 kubenswrapper[7728]: I0223 14:33:46.413629 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_a81da820-e31a-455d-b87f-95322ee57d3a/installer/0.log" Feb 23 14:33:46.413784 master-0 kubenswrapper[7728]: I0223 14:33:46.413738 7728 generic.go:334] "Generic (PLEG): container finished" podID="a81da820-e31a-455d-b87f-95322ee57d3a" containerID="4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f" exitCode=1 Feb 23 14:33:46.414179 master-0 kubenswrapper[7728]: I0223 14:33:46.413795 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a81da820-e31a-455d-b87f-95322ee57d3a","Type":"ContainerDied","Data":"4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f"} Feb 23 14:33:46.414179 master-0 kubenswrapper[7728]: I0223 14:33:46.413948 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a81da820-e31a-455d-b87f-95322ee57d3a","Type":"ContainerDied","Data":"3061aec629b93c613efcc696d7dd1406550961b477efc6823a883d8b2ccdd9eb"} Feb 23 14:33:46.414179 master-0 kubenswrapper[7728]: I0223 14:33:46.413957 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Feb 23 14:33:46.414179 master-0 kubenswrapper[7728]: I0223 14:33:46.414035 7728 scope.go:117] "RemoveContainer" containerID="4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f" Feb 23 14:33:46.441818 master-0 kubenswrapper[7728]: I0223 14:33:46.441759 7728 scope.go:117] "RemoveContainer" containerID="4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f" Feb 23 14:33:46.442384 master-0 kubenswrapper[7728]: E0223 14:33:46.442312 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f\": container with ID starting with 4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f not found: ID does not exist" containerID="4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f" Feb 23 14:33:46.442467 master-0 kubenswrapper[7728]: I0223 14:33:46.442389 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f"} err="failed to get container status \"4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f\": rpc error: code = NotFound desc = could not find container \"4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f\": container with ID starting with 4ab6720cde27468abd78741071a75b59adccc5da3d3ab8179ed1abe8921b6c7f not found: ID does not exist" Feb 23 14:33:46.464333 master-0 kubenswrapper[7728]: I0223 14:33:46.464258 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Feb 23 14:33:46.477528 master-0 kubenswrapper[7728]: I0223 14:33:46.477384 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Feb 23 14:33:46.912278 master-0 kubenswrapper[7728]: I0223 14:33:46.912222 7728 
patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:46.912278 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:46.912278 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:46.912278 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:46.912825 master-0 kubenswrapper[7728]: I0223 14:33:46.912294 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:47.230817 master-0 kubenswrapper[7728]: I0223 14:33:47.230722 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81da820-e31a-455d-b87f-95322ee57d3a" path="/var/lib/kubelet/pods/a81da820-e31a-455d-b87f-95322ee57d3a/volumes" Feb 23 14:33:47.909805 master-0 kubenswrapper[7728]: I0223 14:33:47.909709 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:47.909805 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:47.909805 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:47.909805 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:47.910217 master-0 kubenswrapper[7728]: I0223 14:33:47.909808 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:48.908851 master-0 
kubenswrapper[7728]: I0223 14:33:48.908779 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:48.908851 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:48.908851 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:48.908851 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:48.910761 master-0 kubenswrapper[7728]: I0223 14:33:48.908875 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:49.221515 master-0 kubenswrapper[7728]: I0223 14:33:49.221412 7728 scope.go:117] "RemoveContainer" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f" Feb 23 14:33:49.909886 master-0 kubenswrapper[7728]: I0223 14:33:49.909729 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:49.909886 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:49.909886 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:49.909886 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:49.909886 master-0 kubenswrapper[7728]: I0223 14:33:49.909812 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:50.292780 master-0 kubenswrapper[7728]: I0223 
14:33:50.292721 7728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Feb 23 14:33:50.293106 master-0 kubenswrapper[7728]: I0223 14:33:50.293040 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler" containerID="cri-o://a607c62f2f6fcfd8c1e82eea4a4f2c6c0363686e9d645511b15629d774c518ef" gracePeriod=30 Feb 23 14:33:50.293264 master-0 kubenswrapper[7728]: I0223 14:33:50.293229 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-recovery-controller" containerID="cri-o://1bc827a8854dfec15010881aca2028b9a63aeaba5a66ba610581f32b2d5f3a53" gracePeriod=30 Feb 23 14:33:50.293352 master-0 kubenswrapper[7728]: I0223 14:33:50.293318 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-cert-syncer" containerID="cri-o://a83359f382bbf7e84f344fddfee0b01fcac40fd46b179877c37b9571576884e8" gracePeriod=30 Feb 23 14:33:50.293697 master-0 kubenswrapper[7728]: I0223 14:33:50.293658 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: E0223 14:33:50.294002 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81da820-e31a-455d-b87f-95322ee57d3a" containerName="installer" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294019 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81da820-e31a-455d-b87f-95322ee57d3a" containerName="installer" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: 
E0223 14:33:50.294048 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294056 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: E0223 14:33:50.294072 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-recovery-controller" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294082 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-recovery-controller" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: E0223 14:33:50.294094 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-cert-syncer" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294102 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-cert-syncer" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: E0223 14:33:50.294116 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="wait-for-host-port" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294124 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="wait-for-host-port" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294262 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-recovery-controller" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294295 7728 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a81da820-e31a-455d-b87f-95322ee57d3a" containerName="installer" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294308 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-cert-syncer" Feb 23 14:33:50.294548 master-0 kubenswrapper[7728]: I0223 14:33:50.294320 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler" Feb 23 14:33:50.341633 master-0 kubenswrapper[7728]: I0223 14:33:50.341554 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:50.341841 master-0 kubenswrapper[7728]: I0223 14:33:50.341731 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:50.442334 master-0 kubenswrapper[7728]: I0223 14:33:50.442285 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:50.442430 master-0 kubenswrapper[7728]: I0223 14:33:50.442358 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:50.442430 master-0 kubenswrapper[7728]: I0223 14:33:50.442373 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:50.442566 master-0 kubenswrapper[7728]: I0223 14:33:50.442472 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:50.450889 master-0 kubenswrapper[7728]: I0223 14:33:50.450841 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/4.log" Feb 23 14:33:50.451392 master-0 kubenswrapper[7728]: I0223 14:33:50.451352 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"3f0687c145632743367603550edfb620cb9232947c49e722ee283fed3a1505a2"} Feb 23 14:33:50.453617 master-0 kubenswrapper[7728]: I0223 14:33:50.453572 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_56ff46cdb00d28519af7c0cdc9ea8d11/kube-scheduler-cert-syncer/0.log" Feb 23 14:33:50.454333 master-0 kubenswrapper[7728]: I0223 14:33:50.454281 7728 generic.go:334] 
"Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="1bc827a8854dfec15010881aca2028b9a63aeaba5a66ba610581f32b2d5f3a53" exitCode=0 Feb 23 14:33:50.454333 master-0 kubenswrapper[7728]: I0223 14:33:50.454325 7728 generic.go:334] "Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="a83359f382bbf7e84f344fddfee0b01fcac40fd46b179877c37b9571576884e8" exitCode=2 Feb 23 14:33:50.454472 master-0 kubenswrapper[7728]: I0223 14:33:50.454341 7728 generic.go:334] "Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="a607c62f2f6fcfd8c1e82eea4a4f2c6c0363686e9d645511b15629d774c518ef" exitCode=0 Feb 23 14:33:50.454472 master-0 kubenswrapper[7728]: I0223 14:33:50.454370 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edb6ec0bf3017407524bcef4889f83a1c2c54ff05b0e40918d7eda5368e06757" Feb 23 14:33:50.469755 master-0 kubenswrapper[7728]: I0223 14:33:50.469712 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_56ff46cdb00d28519af7c0cdc9ea8d11/kube-scheduler-cert-syncer/0.log" Feb 23 14:33:50.470518 master-0 kubenswrapper[7728]: I0223 14:33:50.470490 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:50.515144 master-0 kubenswrapper[7728]: I0223 14:33:50.514871 7728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="56ff46cdb00d28519af7c0cdc9ea8d11" podUID="d03a1e6620a92c780b0a91c72a55bc8b" Feb 23 14:33:50.644844 master-0 kubenswrapper[7728]: I0223 14:33:50.644203 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-resource-dir\") pod \"56ff46cdb00d28519af7c0cdc9ea8d11\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " Feb 23 14:33:50.644844 master-0 kubenswrapper[7728]: I0223 14:33:50.644392 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-cert-dir\") pod \"56ff46cdb00d28519af7c0cdc9ea8d11\" (UID: \"56ff46cdb00d28519af7c0cdc9ea8d11\") " Feb 23 14:33:50.644844 master-0 kubenswrapper[7728]: I0223 14:33:50.644757 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "56ff46cdb00d28519af7c0cdc9ea8d11" (UID: "56ff46cdb00d28519af7c0cdc9ea8d11"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:50.644844 master-0 kubenswrapper[7728]: I0223 14:33:50.644784 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "56ff46cdb00d28519af7c0cdc9ea8d11" (UID: "56ff46cdb00d28519af7c0cdc9ea8d11"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:50.746370 master-0 kubenswrapper[7728]: I0223 14:33:50.746220 7728 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-resource-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:50.746370 master-0 kubenswrapper[7728]: I0223 14:33:50.746257 7728 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/56ff46cdb00d28519af7c0cdc9ea8d11-cert-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:50.908230 master-0 kubenswrapper[7728]: I0223 14:33:50.908111 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:50.908230 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:50.908230 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:50.908230 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:50.908230 master-0 kubenswrapper[7728]: I0223 14:33:50.908181 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:51.229892 master-0 kubenswrapper[7728]: I0223 14:33:51.229818 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" path="/var/lib/kubelet/pods/56ff46cdb00d28519af7c0cdc9ea8d11/volumes" Feb 23 14:33:51.465382 master-0 kubenswrapper[7728]: I0223 14:33:51.465295 7728 generic.go:334] "Generic (PLEG): container finished" podID="1225c7e0-f2d1-4b39-979c-c77191862c81" 
containerID="77b1792020215e1792b7b140e0ac936225d54418ad659b82a3189d8865905a56" exitCode=0 Feb 23 14:33:51.465673 master-0 kubenswrapper[7728]: I0223 14:33:51.465361 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"1225c7e0-f2d1-4b39-979c-c77191862c81","Type":"ContainerDied","Data":"77b1792020215e1792b7b140e0ac936225d54418ad659b82a3189d8865905a56"} Feb 23 14:33:51.465673 master-0 kubenswrapper[7728]: I0223 14:33:51.465424 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:33:51.492837 master-0 kubenswrapper[7728]: I0223 14:33:51.492697 7728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="56ff46cdb00d28519af7c0cdc9ea8d11" podUID="d03a1e6620a92c780b0a91c72a55bc8b" Feb 23 14:33:51.907197 master-0 kubenswrapper[7728]: I0223 14:33:51.907092 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:51.907197 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:51.907197 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:51.907197 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:51.907603 master-0 kubenswrapper[7728]: I0223 14:33:51.907571 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:52.806097 master-0 kubenswrapper[7728]: I0223 14:33:52.806052 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Feb 23 14:33:52.907787 master-0 kubenswrapper[7728]: I0223 14:33:52.907656 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:52.907787 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:52.907787 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:52.907787 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:52.908066 master-0 kubenswrapper[7728]: I0223 14:33:52.907814 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:52.978156 master-0 kubenswrapper[7728]: I0223 14:33:52.977777 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1225c7e0-f2d1-4b39-979c-c77191862c81-kube-api-access\") pod \"1225c7e0-f2d1-4b39-979c-c77191862c81\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " Feb 23 14:33:52.978156 master-0 kubenswrapper[7728]: I0223 14:33:52.977917 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-var-lock\") pod \"1225c7e0-f2d1-4b39-979c-c77191862c81\" (UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " Feb 23 14:33:52.978156 master-0 kubenswrapper[7728]: I0223 14:33:52.977944 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-kubelet-dir\") pod \"1225c7e0-f2d1-4b39-979c-c77191862c81\" 
(UID: \"1225c7e0-f2d1-4b39-979c-c77191862c81\") " Feb 23 14:33:52.978156 master-0 kubenswrapper[7728]: I0223 14:33:52.978001 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-var-lock" (OuterVolumeSpecName: "var-lock") pod "1225c7e0-f2d1-4b39-979c-c77191862c81" (UID: "1225c7e0-f2d1-4b39-979c-c77191862c81"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:52.978156 master-0 kubenswrapper[7728]: I0223 14:33:52.978096 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1225c7e0-f2d1-4b39-979c-c77191862c81" (UID: "1225c7e0-f2d1-4b39-979c-c77191862c81"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:33:52.978612 master-0 kubenswrapper[7728]: I0223 14:33:52.978347 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:52.978612 master-0 kubenswrapper[7728]: I0223 14:33:52.978362 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1225c7e0-f2d1-4b39-979c-c77191862c81-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:52.980866 master-0 kubenswrapper[7728]: I0223 14:33:52.980807 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1225c7e0-f2d1-4b39-979c-c77191862c81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1225c7e0-f2d1-4b39-979c-c77191862c81" (UID: "1225c7e0-f2d1-4b39-979c-c77191862c81"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:33:53.079500 master-0 kubenswrapper[7728]: I0223 14:33:53.079353 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1225c7e0-f2d1-4b39-979c-c77191862c81-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:33:53.477996 master-0 kubenswrapper[7728]: I0223 14:33:53.477937 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"1225c7e0-f2d1-4b39-979c-c77191862c81","Type":"ContainerDied","Data":"c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff"} Feb 23 14:33:53.477996 master-0 kubenswrapper[7728]: I0223 14:33:53.477989 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff" Feb 23 14:33:53.478306 master-0 kubenswrapper[7728]: I0223 14:33:53.478028 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Feb 23 14:33:53.908051 master-0 kubenswrapper[7728]: I0223 14:33:53.907935 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:53.908051 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:53.908051 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:53.908051 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:53.908051 master-0 kubenswrapper[7728]: I0223 14:33:53.908016 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:54.906524 master-0 kubenswrapper[7728]: I0223 14:33:54.906438 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:54.906524 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:54.906524 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:54.906524 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:54.906524 master-0 kubenswrapper[7728]: I0223 14:33:54.906518 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:55.907345 master-0 kubenswrapper[7728]: I0223 14:33:55.907231 7728 patch_prober.go:28] interesting 
pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:55.907345 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:55.907345 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:55.907345 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:55.907345 master-0 kubenswrapper[7728]: I0223 14:33:55.907338 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:56.907838 master-0 kubenswrapper[7728]: I0223 14:33:56.907765 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:56.907838 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:56.907838 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:56.907838 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:56.907838 master-0 kubenswrapper[7728]: I0223 14:33:56.907831 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:57.907314 master-0 kubenswrapper[7728]: I0223 14:33:57.907114 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 
14:33:57.907314 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:57.907314 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:57.907314 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:57.907314 master-0 kubenswrapper[7728]: I0223 14:33:57.907196 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:58.908627 master-0 kubenswrapper[7728]: I0223 14:33:58.908555 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:58.908627 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:58.908627 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:58.908627 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:33:58.909297 master-0 kubenswrapper[7728]: I0223 14:33:58.908645 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:33:59.909275 master-0 kubenswrapper[7728]: I0223 14:33:59.909189 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:33:59.909275 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:33:59.909275 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:33:59.909275 master-0 kubenswrapper[7728]: healthz 
check failed Feb 23 14:33:59.910052 master-0 kubenswrapper[7728]: I0223 14:33:59.910024 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:34:00.414990 master-0 kubenswrapper[7728]: I0223 14:34:00.414871 7728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Feb 23 14:34:00.415552 master-0 kubenswrapper[7728]: E0223 14:34:00.415022 7728 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-controller-manager-pod.yaml\": /etc/kubernetes/manifests/kube-controller-manager-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Feb 23 14:34:00.415552 master-0 kubenswrapper[7728]: I0223 14:34:00.415270 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager" containerID="cri-o://cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd" gracePeriod=30 Feb 23 14:34:00.415552 master-0 kubenswrapper[7728]: I0223 14:34:00.415419 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="79b63b0311108e042b4d16d40534ff93" containerName="cluster-policy-controller" containerID="cri-o://ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c" gracePeriod=30 Feb 23 14:34:00.415552 master-0 kubenswrapper[7728]: I0223 14:34:00.415453 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="79b63b0311108e042b4d16d40534ff93" 
containerName="kube-controller-manager-recovery-controller" containerID="cri-o://4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd" gracePeriod=30 Feb 23 14:34:00.415976 master-0 kubenswrapper[7728]: I0223 14:34:00.415419 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014" gracePeriod=30 Feb 23 14:34:00.419300 master-0 kubenswrapper[7728]: I0223 14:34:00.419174 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Feb 23 14:34:00.419938 master-0 kubenswrapper[7728]: E0223 14:34:00.419872 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager" Feb 23 14:34:00.419938 master-0 kubenswrapper[7728]: I0223 14:34:00.419921 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: E0223 14:34:00.419951 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b63b0311108e042b4d16d40534ff93" containerName="cluster-policy-controller" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: I0223 14:34:00.419970 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b63b0311108e042b4d16d40534ff93" containerName="cluster-policy-controller" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: E0223 14:34:00.419998 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1225c7e0-f2d1-4b39-979c-c77191862c81" containerName="installer" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: I0223 14:34:00.420017 7728 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1225c7e0-f2d1-4b39-979c-c77191862c81" containerName="installer" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: E0223 14:34:00.420093 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager-recovery-controller" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: I0223 14:34:00.420114 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager-recovery-controller" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: E0223 14:34:00.420141 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager-cert-syncer" Feb 23 14:34:00.420152 master-0 kubenswrapper[7728]: I0223 14:34:00.420159 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager-cert-syncer" Feb 23 14:34:00.420834 master-0 kubenswrapper[7728]: I0223 14:34:00.420420 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager" Feb 23 14:34:00.420834 master-0 kubenswrapper[7728]: I0223 14:34:00.420460 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager-recovery-controller" Feb 23 14:34:00.420834 master-0 kubenswrapper[7728]: I0223 14:34:00.420522 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b63b0311108e042b4d16d40534ff93" containerName="kube-controller-manager-cert-syncer" Feb 23 14:34:00.420834 master-0 kubenswrapper[7728]: I0223 14:34:00.420574 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b63b0311108e042b4d16d40534ff93" containerName="cluster-policy-controller" Feb 23 14:34:00.420834 master-0 kubenswrapper[7728]: I0223 14:34:00.420609 7728 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1225c7e0-f2d1-4b39-979c-c77191862c81" containerName="installer" Feb 23 14:34:00.585708 master-0 kubenswrapper[7728]: I0223 14:34:00.585649 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:00.585811 master-0 kubenswrapper[7728]: I0223 14:34:00.585717 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:00.687398 master-0 kubenswrapper[7728]: I0223 14:34:00.687301 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:00.687398 master-0 kubenswrapper[7728]: I0223 14:34:00.687383 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:00.687398 master-0 kubenswrapper[7728]: I0223 14:34:00.687405 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:00.687806 master-0 kubenswrapper[7728]: I0223 14:34:00.687570 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:00.908546 master-0 kubenswrapper[7728]: I0223 14:34:00.908453 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 14:34:00.908546 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld Feb 23 14:34:00.908546 master-0 kubenswrapper[7728]: [+]process-running ok Feb 23 14:34:00.908546 master-0 kubenswrapper[7728]: healthz check failed Feb 23 14:34:00.908872 master-0 kubenswrapper[7728]: I0223 14:34:00.908551 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 14:34:01.152779 master-0 kubenswrapper[7728]: I0223 14:34:01.152697 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_79b63b0311108e042b4d16d40534ff93/kube-controller-manager-cert-syncer/0.log" Feb 23 14:34:01.154008 master-0 kubenswrapper[7728]: I0223 14:34:01.153957 7728 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:01.158508 master-0 kubenswrapper[7728]: I0223 14:34:01.158401 7728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="79b63b0311108e042b4d16d40534ff93" podUID="181adc3f4810f127b44f3750f5d2460c" Feb 23 14:34:01.220226 master-0 kubenswrapper[7728]: I0223 14:34:01.220035 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:34:01.247181 master-0 kubenswrapper[7728]: I0223 14:34:01.247115 7728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="39b8d23d-1c12-432f-a397-3779f73f0670" Feb 23 14:34:01.247181 master-0 kubenswrapper[7728]: I0223 14:34:01.247157 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="39b8d23d-1c12-432f-a397-3779f73f0670" Feb 23 14:34:01.295545 master-0 kubenswrapper[7728]: I0223 14:34:01.295303 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-resource-dir\") pod \"79b63b0311108e042b4d16d40534ff93\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " Feb 23 14:34:01.295545 master-0 kubenswrapper[7728]: I0223 14:34:01.295343 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-cert-dir\") pod \"79b63b0311108e042b4d16d40534ff93\" (UID: \"79b63b0311108e042b4d16d40534ff93\") " Feb 23 14:34:01.295545 master-0 kubenswrapper[7728]: I0223 14:34:01.295434 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "79b63b0311108e042b4d16d40534ff93" (UID: "79b63b0311108e042b4d16d40534ff93"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:34:01.295545 master-0 kubenswrapper[7728]: I0223 14:34:01.295526 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "79b63b0311108e042b4d16d40534ff93" (UID: "79b63b0311108e042b4d16d40534ff93"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:34:01.298824 master-0 kubenswrapper[7728]: I0223 14:34:01.298740 7728 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-resource-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:34:01.298824 master-0 kubenswrapper[7728]: I0223 14:34:01.298808 7728 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/79b63b0311108e042b4d16d40534ff93-cert-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:34:01.305548 master-0 kubenswrapper[7728]: I0223 14:34:01.305457 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Feb 23 14:34:01.312213 master-0 kubenswrapper[7728]: I0223 14:34:01.311016 7728 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:34:01.319967 master-0 kubenswrapper[7728]: I0223 14:34:01.319452 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Feb 23 14:34:01.459771 master-0 kubenswrapper[7728]: I0223 14:34:01.459663 7728 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Feb 23 14:34:01.462605 master-0 kubenswrapper[7728]: I0223 14:34:01.462453 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:34:01.493040 master-0 kubenswrapper[7728]: W0223 14:34:01.492955 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd03a1e6620a92c780b0a91c72a55bc8b.slice/crio-0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6 WatchSource:0}: Error finding container 0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6: Status 404 returned error can't find the container with id 0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6
Feb 23 14:34:01.539106 master-0 kubenswrapper[7728]: I0223 14:34:01.539051 7728 generic.go:334] "Generic (PLEG): container finished" podID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerID="f2ac2c56c1e34a13c986eed31e989cbdb13313c0a91d44d3e68704b2399e5a39" exitCode=0
Feb 23 14:34:01.539326 master-0 kubenswrapper[7728]: I0223 14:34:01.539171 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"a52cbaf6-c1af-4c29-aef9-67523f5148c6","Type":"ContainerDied","Data":"f2ac2c56c1e34a13c986eed31e989cbdb13313c0a91d44d3e68704b2399e5a39"}
Feb 23 14:34:01.541557 master-0 kubenswrapper[7728]: I0223 14:34:01.541452 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6"}
Feb 23 14:34:01.550804 master-0 kubenswrapper[7728]: I0223 14:34:01.548587 7728 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_79b63b0311108e042b4d16d40534ff93/kube-controller-manager-cert-syncer/0.log"
Feb 23 14:34:01.550804 master-0 kubenswrapper[7728]: I0223 14:34:01.549804 7728 generic.go:334] "Generic (PLEG): container finished" podID="79b63b0311108e042b4d16d40534ff93" containerID="4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd" exitCode=0
Feb 23 14:34:01.550804 master-0 kubenswrapper[7728]: I0223 14:34:01.549824 7728 generic.go:334] "Generic (PLEG): container finished" podID="79b63b0311108e042b4d16d40534ff93" containerID="3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014" exitCode=2
Feb 23 14:34:01.550804 master-0 kubenswrapper[7728]: I0223 14:34:01.549830 7728 generic.go:334] "Generic (PLEG): container finished" podID="79b63b0311108e042b4d16d40534ff93" containerID="ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c" exitCode=0
Feb 23 14:34:01.550804 master-0 kubenswrapper[7728]: I0223 14:34:01.549837 7728 generic.go:334] "Generic (PLEG): container finished" podID="79b63b0311108e042b4d16d40534ff93" containerID="cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd" exitCode=0
Feb 23 14:34:01.550804 master-0 kubenswrapper[7728]: I0223 14:34:01.549879 7728 scope.go:117] "RemoveContainer" containerID="4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"
Feb 23 14:34:01.550804 master-0 kubenswrapper[7728]: I0223 14:34:01.549901 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:01.574853 master-0 kubenswrapper[7728]: I0223 14:34:01.572640 7728 scope.go:117] "RemoveContainer" containerID="3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"
Feb 23 14:34:01.589992 master-0 kubenswrapper[7728]: I0223 14:34:01.589949 7728 scope.go:117] "RemoveContainer" containerID="ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"
Feb 23 14:34:01.615801 master-0 kubenswrapper[7728]: I0223 14:34:01.615762 7728 scope.go:117] "RemoveContainer" containerID="cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"
Feb 23 14:34:01.632800 master-0 kubenswrapper[7728]: I0223 14:34:01.632744 7728 scope.go:117] "RemoveContainer" containerID="4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"
Feb 23 14:34:01.633314 master-0 kubenswrapper[7728]: E0223 14:34:01.633262 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": container with ID starting with 4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd not found: ID does not exist" containerID="4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"
Feb 23 14:34:01.633354 master-0 kubenswrapper[7728]: I0223 14:34:01.633326 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"} err="failed to get container status \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": rpc error: code = NotFound desc = could not find container \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": container with ID starting with 4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd not found: ID does not exist"
Feb 23 14:34:01.633392 master-0 kubenswrapper[7728]: I0223 14:34:01.633360 7728 scope.go:117] "RemoveContainer" containerID="3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"
Feb 23 14:34:01.633863 master-0 kubenswrapper[7728]: E0223 14:34:01.633802 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": container with ID starting with 3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014 not found: ID does not exist" containerID="3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"
Feb 23 14:34:01.633863 master-0 kubenswrapper[7728]: I0223 14:34:01.633849 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"} err="failed to get container status \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": rpc error: code = NotFound desc = could not find container \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": container with ID starting with 3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014 not found: ID does not exist"
Feb 23 14:34:01.633975 master-0 kubenswrapper[7728]: I0223 14:34:01.633876 7728 scope.go:117] "RemoveContainer" containerID="ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"
Feb 23 14:34:01.634157 master-0 kubenswrapper[7728]: E0223 14:34:01.634122 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": container with ID starting with ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c not found: ID does not exist" containerID="ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"
Feb 23 14:34:01.634198 master-0 kubenswrapper[7728]: I0223 14:34:01.634151 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"} err="failed to get container status \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": rpc error: code = NotFound desc = could not find container \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": container with ID starting with ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c not found: ID does not exist"
Feb 23 14:34:01.634198 master-0 kubenswrapper[7728]: I0223 14:34:01.634168 7728 scope.go:117] "RemoveContainer" containerID="cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"
Feb 23 14:34:01.634505 master-0 kubenswrapper[7728]: E0223 14:34:01.634460 7728 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": container with ID starting with cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd not found: ID does not exist" containerID="cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"
Feb 23 14:34:01.634556 master-0 kubenswrapper[7728]: I0223 14:34:01.634498 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"} err="failed to get container status \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": rpc error: code = NotFound desc = could not find container \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": container with ID starting with cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd not found: ID does not exist"
Feb 23 14:34:01.634556 master-0 kubenswrapper[7728]: I0223 14:34:01.634514 7728 scope.go:117] "RemoveContainer" containerID="4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"
Feb 23 14:34:01.634770 master-0 kubenswrapper[7728]: I0223 14:34:01.634739 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"} err="failed to get container status \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": rpc error: code = NotFound desc = could not find container \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": container with ID starting with 4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd not found: ID does not exist"
Feb 23 14:34:01.634770 master-0 kubenswrapper[7728]: I0223 14:34:01.634757 7728 scope.go:117] "RemoveContainer" containerID="3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"
Feb 23 14:34:01.635007 master-0 kubenswrapper[7728]: I0223 14:34:01.634980 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"} err="failed to get container status \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": rpc error: code = NotFound desc = could not find container \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": container with ID starting with 3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014 not found: ID does not exist"
Feb 23 14:34:01.635007 master-0 kubenswrapper[7728]: I0223 14:34:01.634999 7728 scope.go:117] "RemoveContainer" containerID="ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"
Feb 23 14:34:01.635317 master-0 kubenswrapper[7728]: I0223 14:34:01.635274 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"} err="failed to get container status \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": rpc error: code = NotFound desc = could not find container \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": container with ID starting with ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c not found: ID does not exist"
Feb 23 14:34:01.635317 master-0 kubenswrapper[7728]: I0223 14:34:01.635307 7728 scope.go:117] "RemoveContainer" containerID="cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"
Feb 23 14:34:01.635586 master-0 kubenswrapper[7728]: I0223 14:34:01.635546 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"} err="failed to get container status \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": rpc error: code = NotFound desc = could not find container \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": container with ID starting with cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd not found: ID does not exist"
Feb 23 14:34:01.635586 master-0 kubenswrapper[7728]: I0223 14:34:01.635580 7728 scope.go:117] "RemoveContainer" containerID="4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"
Feb 23 14:34:01.635883 master-0 kubenswrapper[7728]: I0223 14:34:01.635829 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"} err="failed to get container status \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": rpc error: code = NotFound desc = could not find container \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": container with ID starting with 4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd not found: ID does not exist"
Feb 23 14:34:01.635883 master-0 kubenswrapper[7728]: I0223 14:34:01.635876 7728 scope.go:117] "RemoveContainer" containerID="3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"
Feb 23 14:34:01.636147 master-0 kubenswrapper[7728]: I0223 14:34:01.636109 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"} err="failed to get container status \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": rpc error: code = NotFound desc = could not find container \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": container with ID starting with 3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014 not found: ID does not exist"
Feb 23 14:34:01.636147 master-0 kubenswrapper[7728]: I0223 14:34:01.636136 7728 scope.go:117] "RemoveContainer" containerID="ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"
Feb 23 14:34:01.636529 master-0 kubenswrapper[7728]: I0223 14:34:01.636466 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"} err="failed to get container status \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": rpc error: code = NotFound desc = could not find container \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": container with ID starting with ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c not found: ID does not exist"
Feb 23 14:34:01.636529 master-0 kubenswrapper[7728]: I0223 14:34:01.636520 7728 scope.go:117] "RemoveContainer" containerID="cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"
Feb 23 14:34:01.636869 master-0 kubenswrapper[7728]: I0223 14:34:01.636832 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"} err="failed to get container status \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": rpc error: code = NotFound desc = could not find container \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": container with ID starting with cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd not found: ID does not exist"
Feb 23 14:34:01.636869 master-0 kubenswrapper[7728]: I0223 14:34:01.636857 7728 scope.go:117] "RemoveContainer" containerID="4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"
Feb 23 14:34:01.637172 master-0 kubenswrapper[7728]: I0223 14:34:01.637135 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd"} err="failed to get container status \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": rpc error: code = NotFound desc = could not find container \"4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd\": container with ID starting with 4727007b722af10f03ef1e413d9a5307206f80fd37d32a2fd4680a08e0c09abd not found: ID does not exist"
Feb 23 14:34:01.637172 master-0 kubenswrapper[7728]: I0223 14:34:01.637166 7728 scope.go:117] "RemoveContainer" containerID="3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"
Feb 23 14:34:01.637582 master-0 kubenswrapper[7728]: I0223 14:34:01.637547 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014"} err="failed to get container status \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": rpc error: code = NotFound desc = could not find container \"3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014\": container with ID starting with 3128474a58df77e7f674ea5fc2b490b80a290b5c331332dff119879f6a4cc014 not found: ID does not exist"
Feb 23 14:34:01.637582 master-0 kubenswrapper[7728]: I0223 14:34:01.637574 7728 scope.go:117] "RemoveContainer" containerID="ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"
Feb 23 14:34:01.637838 master-0 kubenswrapper[7728]: I0223 14:34:01.637801 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c"} err="failed to get container status \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": rpc error: code = NotFound desc = could not find container \"ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c\": container with ID starting with ebb1604449992626b0f5a092af2505cc2585db163d59c98aca43b3946d7b4f4c not found: ID does not exist"
Feb 23 14:34:01.637838 master-0 kubenswrapper[7728]: I0223 14:34:01.637831 7728 scope.go:117] "RemoveContainer" containerID="cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"
Feb 23 14:34:01.638243 master-0 kubenswrapper[7728]: I0223 14:34:01.638207 7728 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd"} err="failed to get container status \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": rpc error: code = NotFound desc = could not find container \"cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd\": container with ID starting with cc06826ca4ea200fc8d9842f30146d34aae96828a181bf8a495c15fc9235ebdd not found: ID does not exist"
Feb 23 14:34:01.645872 master-0 kubenswrapper[7728]: I0223 14:34:01.645809 7728 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="79b63b0311108e042b4d16d40534ff93" podUID="181adc3f4810f127b44f3750f5d2460c"
Feb 23 14:34:01.908188 master-0 kubenswrapper[7728]: I0223 14:34:01.908073 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:01.908188 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:01.908188 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:01.908188 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:01.908460 master-0 kubenswrapper[7728]: I0223 14:34:01.908217 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:02.559174 master-0 kubenswrapper[7728]: I0223 14:34:02.559105 7728 generic.go:334] "Generic (PLEG): container finished" podID="d03a1e6620a92c780b0a91c72a55bc8b" containerID="e3f7365c1acb54a72a772e182d383b3e70a626dbae5d085f9cd81b46982b0137" exitCode=0
Feb 23 14:34:02.559174 master-0 kubenswrapper[7728]: I0223 14:34:02.559181 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerDied","Data":"e3f7365c1acb54a72a772e182d383b3e70a626dbae5d085f9cd81b46982b0137"}
Feb 23 14:34:02.883101 master-0 kubenswrapper[7728]: I0223 14:34:02.883056 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:34:02.908310 master-0 kubenswrapper[7728]: I0223 14:34:02.907813 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:02.908310 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:02.908310 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:02.908310 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:02.908310 master-0 kubenswrapper[7728]: I0223 14:34:02.907901 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:02.923901 master-0 kubenswrapper[7728]: I0223 14:34:02.923808 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kube-api-access\") pod \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") "
Feb 23 14:34:02.926716 master-0 kubenswrapper[7728]: I0223 14:34:02.926640 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a52cbaf6-c1af-4c29-aef9-67523f5148c6" (UID: "a52cbaf6-c1af-4c29-aef9-67523f5148c6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:34:03.024948 master-0 kubenswrapper[7728]: I0223 14:34:03.024892 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kubelet-dir\") pod \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") "
Feb 23 14:34:03.025435 master-0 kubenswrapper[7728]: I0223 14:34:03.025404 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-var-lock\") pod \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\" (UID: \"a52cbaf6-c1af-4c29-aef9-67523f5148c6\") "
Feb 23 14:34:03.026216 master-0 kubenswrapper[7728]: I0223 14:34:03.025105 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a52cbaf6-c1af-4c29-aef9-67523f5148c6" (UID: "a52cbaf6-c1af-4c29-aef9-67523f5148c6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:03.026216 master-0 kubenswrapper[7728]: I0223 14:34:03.025578 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-var-lock" (OuterVolumeSpecName: "var-lock") pod "a52cbaf6-c1af-4c29-aef9-67523f5148c6" (UID: "a52cbaf6-c1af-4c29-aef9-67523f5148c6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:03.026657 master-0 kubenswrapper[7728]: I0223 14:34:03.026616 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:03.026833 master-0 kubenswrapper[7728]: I0223 14:34:03.026806 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a52cbaf6-c1af-4c29-aef9-67523f5148c6-kube-api-access\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:03.026965 master-0 kubenswrapper[7728]: I0223 14:34:03.026945 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a52cbaf6-c1af-4c29-aef9-67523f5148c6-var-lock\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:03.247827 master-0 kubenswrapper[7728]: I0223 14:34:03.247784 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b63b0311108e042b4d16d40534ff93" path="/var/lib/kubelet/pods/79b63b0311108e042b4d16d40534ff93/volumes"
Feb 23 14:34:03.573824 master-0 kubenswrapper[7728]: I0223 14:34:03.573779 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Feb 23 14:34:03.574545 master-0 kubenswrapper[7728]: I0223 14:34:03.573815 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"a52cbaf6-c1af-4c29-aef9-67523f5148c6","Type":"ContainerDied","Data":"a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914"}
Feb 23 14:34:03.574639 master-0 kubenswrapper[7728]: I0223 14:34:03.574559 7728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914"
Feb 23 14:34:03.578014 master-0 kubenswrapper[7728]: I0223 14:34:03.577943 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"cecfab4dc3898d095fcb3a6b3e1cf511480563e469afa63033300c82e251f626"}
Feb 23 14:34:03.578014 master-0 kubenswrapper[7728]: I0223 14:34:03.578016 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"25bfdd1b2d2ccd5f7ad2b82ecfb58c4fad723643d443beb5181c17d3e22ca1ac"}
Feb 23 14:34:03.907614 master-0 kubenswrapper[7728]: I0223 14:34:03.907508 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:03.907614 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:03.907614 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:03.907614 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:03.908124 master-0 kubenswrapper[7728]: I0223 14:34:03.907620 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:04.586595 master-0 kubenswrapper[7728]: I0223 14:34:04.586515 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"a45da35c56f17703c91fdc8040f2ec004087a605a92000fedcffc1936ffcae7f"}
Feb 23 14:34:04.587145 master-0 kubenswrapper[7728]: I0223 14:34:04.586739 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:34:04.635364 master-0 kubenswrapper[7728]: I0223 14:34:04.635263 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=3.635242679 podStartE2EDuration="3.635242679s" podCreationTimestamp="2026-02-23 14:34:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:34:04.631852317 +0000 UTC m=+937.594513683" watchObservedRunningTime="2026-02-23 14:34:04.635242679 +0000 UTC m=+937.597903975"
Feb 23 14:34:04.909228 master-0 kubenswrapper[7728]: I0223 14:34:04.909033 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:04.909228 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:04.909228 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:04.909228 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:04.909228 master-0 kubenswrapper[7728]: I0223 14:34:04.909135 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:05.907073 master-0 kubenswrapper[7728]: I0223 14:34:05.906957 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:05.907073 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:05.907073 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:05.907073 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:05.907073 master-0 kubenswrapper[7728]: I0223 14:34:05.907022 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:06.908119 master-0 kubenswrapper[7728]: I0223 14:34:06.908016 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:06.908119 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:06.908119 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:06.908119 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:06.908119 master-0 kubenswrapper[7728]: I0223 14:34:06.908113 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:07.907449 master-0 kubenswrapper[7728]: I0223 14:34:07.907375 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:07.907449 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:07.907449 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:07.907449 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:07.907778 master-0 kubenswrapper[7728]: I0223 14:34:07.907467 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:08.908471 master-0 kubenswrapper[7728]: I0223 14:34:08.908425 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:08.908471 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:08.908471 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:08.908471 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:08.909310 master-0 kubenswrapper[7728]: I0223 14:34:08.909077 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:09.907295 master-0 kubenswrapper[7728]: I0223 14:34:09.907239 7728 patch_prober.go:28] interesting pod/router-default-7b65dc9fcb-w68qb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 14:34:09.907295 master-0 kubenswrapper[7728]: [-]has-synced failed: reason withheld
Feb 23 14:34:09.907295 master-0 kubenswrapper[7728]: [+]process-running ok
Feb 23 14:34:09.907295 master-0 kubenswrapper[7728]: healthz check failed
Feb 23 14:34:09.907694 master-0 kubenswrapper[7728]: I0223 14:34:09.907304 7728 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 14:34:09.907694 master-0 kubenswrapper[7728]: I0223 14:34:09.907364 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:34:09.907958 master-0 kubenswrapper[7728]: I0223 14:34:09.907922 7728 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06"} pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" containerMessage="Container router failed startup probe, will be restarted"
Feb 23 14:34:09.908012 master-0 kubenswrapper[7728]: I0223 14:34:09.907960 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" podUID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerName="router" containerID="cri-o://7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06" gracePeriod=3600
Feb 23 14:34:11.220583 master-0 kubenswrapper[7728]: I0223 14:34:11.220459 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:11.236760 master-0 kubenswrapper[7728]: I0223 14:34:11.236701 7728 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="15b2891f-e293-470b-9906-e7bee91cedb5"
Feb 23 14:34:11.236760 master-0 kubenswrapper[7728]: I0223 14:34:11.236737 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="15b2891f-e293-470b-9906-e7bee91cedb5"
Feb 23 14:34:11.248723 master-0 kubenswrapper[7728]: I0223 14:34:11.248678 7728 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Feb 23 14:34:11.257839 master-0 kubenswrapper[7728]: I0223 14:34:11.257754 7728 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:11.270177 master-0 kubenswrapper[7728]: I0223 14:34:11.270125 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:11.270761 master-0 kubenswrapper[7728]: I0223 14:34:11.270734 7728 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Feb 23 14:34:11.275054 master-0 kubenswrapper[7728]: I0223 14:34:11.274935 7728 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Feb 23 14:34:11.636026 master-0 kubenswrapper[7728]: I0223 14:34:11.635962 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"ea1409538bec46d9eceb195d8a31f70cddcab9c02d2f2d5acf77e88b46aed24f"}
Feb 23 14:34:11.636026 master-0 kubenswrapper[7728]: I0223 14:34:11.636010 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"d25cb0e087422893459bc0facfa4b23104f61b45c1b3b866e5b4e0e0ad019f99"}
Feb 23 14:34:12.644830 master-0 kubenswrapper[7728]: I0223 14:34:12.644774 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"}
Feb 23 14:34:12.645387 master-0 kubenswrapper[7728]: I0223 14:34:12.644846 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"}
Feb 23 14:34:12.645387 master-0 kubenswrapper[7728]: I0223 14:34:12.644863 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"}
Feb 23 14:34:12.672533 master-0 kubenswrapper[7728]: I0223 14:34:12.669641 7728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.669624148 podStartE2EDuration="1.669624148s" podCreationTimestamp="2026-02-23 14:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:34:12.668271131 +0000 UTC m=+945.630932437" watchObservedRunningTime="2026-02-23 14:34:12.669624148 +0000 UTC m=+945.632285444"
Feb 23 14:34:21.270418 master-0 kubenswrapper[7728]: I0223 14:34:21.270336 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:21.270418 master-0 kubenswrapper[7728]: I0223 14:34:21.270413 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:21.270418 master-0 kubenswrapper[7728]: I0223 14:34:21.270436 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:21.271590 master-0 kubenswrapper[7728]: I0223 14:34:21.270458 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:21.274407 master-0 kubenswrapper[7728]: I0223 14:34:21.274332 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23
14:34:21.276145 master-0 kubenswrapper[7728]: I0223 14:34:21.276098 7728 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:21.708410 master-0 kubenswrapper[7728]: I0223 14:34:21.708336 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:21.712867 master-0 kubenswrapper[7728]: I0223 14:34:21.712809 7728 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:24.716622 master-0 kubenswrapper[7728]: I0223 14:34:24.716379 7728 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Feb 23 14:34:24.717271 master-0 kubenswrapper[7728]: I0223 14:34:24.716764 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver" containerID="cri-o://bdd3290dcf6f732f006b381bec2edfc3a7a58623787040a36811efd529225351" gracePeriod=15 Feb 23 14:34:24.717271 master-0 kubenswrapper[7728]: I0223 14:34:24.716965 7728 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e3d987d25306f70a7327b5bce6ea549b476972db2d3366cf37d35b30c1531578" gracePeriod=15 Feb 23 14:34:24.718251 master-0 kubenswrapper[7728]: I0223 14:34:24.718198 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Feb 23 14:34:24.718580 master-0 kubenswrapper[7728]: E0223 14:34:24.718555 7728 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver" Feb 23 14:34:24.718640 master-0 kubenswrapper[7728]: I0223 14:34:24.718580 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver" Feb 23 14:34:24.718640 master-0 kubenswrapper[7728]: E0223 14:34:24.718620 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="setup" Feb 23 14:34:24.718640 master-0 kubenswrapper[7728]: I0223 14:34:24.718633 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="setup" Feb 23 14:34:24.718753 master-0 kubenswrapper[7728]: E0223 14:34:24.718652 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver-insecure-readyz" Feb 23 14:34:24.718753 master-0 kubenswrapper[7728]: I0223 14:34:24.718661 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver-insecure-readyz" Feb 23 14:34:24.718753 master-0 kubenswrapper[7728]: E0223 14:34:24.718683 7728 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerName="installer" Feb 23 14:34:24.718753 master-0 kubenswrapper[7728]: I0223 14:34:24.718691 7728 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerName="installer" Feb 23 14:34:24.718937 master-0 kubenswrapper[7728]: I0223 14:34:24.718913 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver" Feb 23 14:34:24.718982 master-0 kubenswrapper[7728]: I0223 14:34:24.718958 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver-insecure-readyz" Feb 23 14:34:24.718982 master-0 
kubenswrapper[7728]: I0223 14:34:24.718972 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerName="installer" Feb 23 14:34:24.719069 master-0 kubenswrapper[7728]: I0223 14:34:24.718987 7728 memory_manager.go:354] "RemoveStaleState removing state" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="setup" Feb 23 14:34:24.721523 master-0 kubenswrapper[7728]: I0223 14:34:24.721215 7728 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:34:24.721523 master-0 kubenswrapper[7728]: I0223 14:34:24.721412 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.722139 master-0 kubenswrapper[7728]: I0223 14:34:24.722045 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.762445 master-0 kubenswrapper[7728]: E0223 14:34:24.762367 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.771343 master-0 kubenswrapper[7728]: E0223 14:34:24.771247 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.830188 master-0 kubenswrapper[7728]: I0223 14:34:24.830078 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.830188 master-0 kubenswrapper[7728]: I0223 14:34:24.830152 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.830414 master-0 kubenswrapper[7728]: I0223 14:34:24.830363 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.830631 master-0 kubenswrapper[7728]: I0223 14:34:24.830462 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.830704 master-0 kubenswrapper[7728]: I0223 14:34:24.830679 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.830813 master-0 kubenswrapper[7728]: I0223 14:34:24.830779 7728 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.830854 master-0 kubenswrapper[7728]: I0223 14:34:24.830831 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.830940 master-0 kubenswrapper[7728]: I0223 14:34:24.830910 7728 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.932918 master-0 kubenswrapper[7728]: I0223 14:34:24.932857 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933057 master-0 kubenswrapper[7728]: I0223 14:34:24.933000 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933186 master-0 kubenswrapper[7728]: I0223 14:34:24.933139 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933264 master-0 kubenswrapper[7728]: I0223 14:34:24.933228 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933316 master-0 kubenswrapper[7728]: I0223 14:34:24.933237 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933316 master-0 kubenswrapper[7728]: I0223 14:34:24.933285 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933389 master-0 kubenswrapper[7728]: I0223 14:34:24.933366 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.933584 master-0 kubenswrapper[7728]: I0223 14:34:24.933388 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933584 master-0 kubenswrapper[7728]: I0223 14:34:24.933430 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933584 master-0 kubenswrapper[7728]: I0223 14:34:24.933466 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.933584 master-0 kubenswrapper[7728]: I0223 14:34:24.933510 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933584 master-0 kubenswrapper[7728]: I0223 14:34:24.933520 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:24.933584 master-0 kubenswrapper[7728]: I0223 14:34:24.933559 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.933775 master-0 kubenswrapper[7728]: I0223 14:34:24.933602 7728 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.933775 master-0 kubenswrapper[7728]: I0223 14:34:24.933668 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:24.933775 master-0 kubenswrapper[7728]: I0223 14:34:24.933703 7728 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:25.063694 master-0 kubenswrapper[7728]: I0223 14:34:25.063539 7728 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:25.072078 master-0 kubenswrapper[7728]: I0223 14:34:25.072001 7728 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:25.103319 master-0 kubenswrapper[7728]: W0223 14:34:25.103236 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959c75833224b4ba3fa488b77d8f5032.slice/crio-77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f WatchSource:0}: Error finding container 77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f: Status 404 returned error can't find the container with id 77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f Feb 23 14:34:25.105723 master-0 kubenswrapper[7728]: W0223 14:34:25.105697 7728 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafeec80f2ec1ff5cb32c2367912befef.slice/crio-1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf WatchSource:0}: Error finding container 1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf: Status 404 returned error can't find the container with id 1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf Feb 23 14:34:25.106571 master-0 kubenswrapper[7728]: E0223 14:34:25.106335 7728 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-master-0.1896e6cb7f4fb80a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-master-0,UID:959c75833224b4ba3fa488b77d8f5032,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8177c465e14c63854e5c0fa95ca0635cffc9b5dd3d077ecf971feedbc42b1274\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:34:25.105549322 +0000 UTC m=+958.068210618,LastTimestamp:2026-02-23 14:34:25.105549322 +0000 UTC m=+958.068210618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:34:25.223726 master-0 kubenswrapper[7728]: E0223 14:34:25.223639 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:34:25.224558 master-0 kubenswrapper[7728]: E0223 14:34:25.224506 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:34:25.225589 master-0 kubenswrapper[7728]: E0223 14:34:25.225534 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:34:25.228054 master-0 kubenswrapper[7728]: E0223 14:34:25.228009 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: 
connection refused" Feb 23 14:34:25.228502 master-0 kubenswrapper[7728]: E0223 14:34:25.228449 7728 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:34:25.228554 master-0 kubenswrapper[7728]: I0223 14:34:25.228501 7728 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 23 14:34:25.229270 master-0 kubenswrapper[7728]: E0223 14:34:25.229217 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Feb 23 14:34:25.430877 master-0 kubenswrapper[7728]: E0223 14:34:25.430791 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Feb 23 14:34:25.752217 master-0 kubenswrapper[7728]: I0223 14:34:25.752107 7728 generic.go:334] "Generic (PLEG): container finished" podID="687e92a6cecf1e2beeef16a0b322ad08" containerID="e3d987d25306f70a7327b5bce6ea549b476972db2d3366cf37d35b30c1531578" exitCode=0 Feb 23 14:34:25.754270 master-0 kubenswrapper[7728]: I0223 14:34:25.754185 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"afeec80f2ec1ff5cb32c2367912befef","Type":"ContainerStarted","Data":"d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317"} Feb 23 14:34:25.754429 master-0 kubenswrapper[7728]: I0223 14:34:25.754271 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"afeec80f2ec1ff5cb32c2367912befef","Type":"ContainerStarted","Data":"1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf"} Feb 23 14:34:25.755880 master-0 kubenswrapper[7728]: I0223 14:34:25.755825 7728 generic.go:334] "Generic (PLEG): container finished" podID="e1148263-7b15-4c12-a217-8b030ecd9348" containerID="909663cdb0c0ac8db46b5e0989f1e87cc68ef03f3124e36ed314cba8e6058032" exitCode=0 Feb 23 14:34:25.756001 master-0 kubenswrapper[7728]: I0223 14:34:25.755902 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"e1148263-7b15-4c12-a217-8b030ecd9348","Type":"ContainerDied","Data":"909663cdb0c0ac8db46b5e0989f1e87cc68ef03f3124e36ed314cba8e6058032"} Feb 23 14:34:25.756001 master-0 kubenswrapper[7728]: E0223 14:34:25.755961 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:25.757520 master-0 kubenswrapper[7728]: I0223 14:34:25.757066 7728 status_manager.go:851] "Failed to get status for pod" podUID="e1148263-7b15-4c12-a217-8b030ecd9348" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:34:25.757668 master-0 kubenswrapper[7728]: I0223 14:34:25.757527 7728 generic.go:334] "Generic (PLEG): container finished" podID="959c75833224b4ba3fa488b77d8f5032" containerID="40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211" exitCode=0 Feb 23 14:34:25.757668 master-0 kubenswrapper[7728]: I0223 14:34:25.757575 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerDied","Data":"40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211"} Feb 23 14:34:25.757668 master-0 kubenswrapper[7728]: I0223 14:34:25.757614 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f"} Feb 23 14:34:25.759265 master-0 kubenswrapper[7728]: I0223 14:34:25.759178 7728 status_manager.go:851] "Failed to get status for pod" podUID="e1148263-7b15-4c12-a217-8b030ecd9348" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:34:25.759265 master-0 kubenswrapper[7728]: E0223 14:34:25.759191 7728 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:25.831954 master-0 kubenswrapper[7728]: E0223 14:34:25.831872 7728 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Feb 23 14:34:26.815867 master-0 kubenswrapper[7728]: I0223 14:34:26.815017 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9"} Feb 23 
14:34:26.815867 master-0 kubenswrapper[7728]: I0223 14:34:26.815075 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193"}
Feb 23 14:34:26.815867 master-0 kubenswrapper[7728]: I0223 14:34:26.815089 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa"}
Feb 23 14:34:26.815867 master-0 kubenswrapper[7728]: I0223 14:34:26.815101 7728 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2"}
Feb 23 14:34:27.034635 master-0 kubenswrapper[7728]: I0223 14:34:27.034165 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Feb 23 14:34:27.067278 master-0 kubenswrapper[7728]: I0223 14:34:27.067213 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") pod \"687e92a6cecf1e2beeef16a0b322ad08\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") "
Feb 23 14:34:27.067278 master-0 kubenswrapper[7728]: I0223 14:34:27.067271 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") pod \"687e92a6cecf1e2beeef16a0b322ad08\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") "
Feb 23 14:34:27.067530 master-0 kubenswrapper[7728]: I0223 14:34:27.067328 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") pod \"687e92a6cecf1e2beeef16a0b322ad08\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") "
Feb 23 14:34:27.067530 master-0 kubenswrapper[7728]: I0223 14:34:27.067423 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") pod \"687e92a6cecf1e2beeef16a0b322ad08\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") "
Feb 23 14:34:27.067530 master-0 kubenswrapper[7728]: I0223 14:34:27.067439 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") pod \"687e92a6cecf1e2beeef16a0b322ad08\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") "
Feb 23 14:34:27.067530 master-0 kubenswrapper[7728]: I0223 14:34:27.067452 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") pod \"687e92a6cecf1e2beeef16a0b322ad08\" (UID: \"687e92a6cecf1e2beeef16a0b322ad08\") "
Feb 23 14:34:27.067799 master-0 kubenswrapper[7728]: I0223 14:34:27.067766 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "687e92a6cecf1e2beeef16a0b322ad08" (UID: "687e92a6cecf1e2beeef16a0b322ad08"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.067845 master-0 kubenswrapper[7728]: I0223 14:34:27.067801 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "687e92a6cecf1e2beeef16a0b322ad08" (UID: "687e92a6cecf1e2beeef16a0b322ad08"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.067845 master-0 kubenswrapper[7728]: I0223 14:34:27.067816 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets" (OuterVolumeSpecName: "secrets") pod "687e92a6cecf1e2beeef16a0b322ad08" (UID: "687e92a6cecf1e2beeef16a0b322ad08"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.067845 master-0 kubenswrapper[7728]: I0223 14:34:27.067829 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config" (OuterVolumeSpecName: "config") pod "687e92a6cecf1e2beeef16a0b322ad08" (UID: "687e92a6cecf1e2beeef16a0b322ad08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.067845 master-0 kubenswrapper[7728]: I0223 14:34:27.067841 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "687e92a6cecf1e2beeef16a0b322ad08" (UID: "687e92a6cecf1e2beeef16a0b322ad08"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.067990 master-0 kubenswrapper[7728]: I0223 14:34:27.067855 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs" (OuterVolumeSpecName: "logs") pod "687e92a6cecf1e2beeef16a0b322ad08" (UID: "687e92a6cecf1e2beeef16a0b322ad08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.145075 master-0 kubenswrapper[7728]: I0223 14:34:27.145036 7728 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:27.169412 master-0 kubenswrapper[7728]: I0223 14:34:27.169317 7728 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.169412 master-0 kubenswrapper[7728]: I0223 14:34:27.169370 7728 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-logs\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.169412 master-0 kubenswrapper[7728]: I0223 14:34:27.169384 7728 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.169412 master-0 kubenswrapper[7728]: I0223 14:34:27.169400 7728 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-audit-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.169412 master-0 kubenswrapper[7728]: I0223 14:34:27.169412 7728 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-secrets\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.169412 master-0 kubenswrapper[7728]: I0223 14:34:27.169426 7728 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/687e92a6cecf1e2beeef16a0b322ad08-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.229114 master-0 kubenswrapper[7728]: I0223 14:34:27.229004 7728 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687e92a6cecf1e2beeef16a0b322ad08" path="/var/lib/kubelet/pods/687e92a6cecf1e2beeef16a0b322ad08/volumes"
Feb 23 14:34:27.229494 master-0 kubenswrapper[7728]: I0223 14:34:27.229460 7728 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Feb 23 14:34:27.270447 master-0 kubenswrapper[7728]: I0223 14:34:27.270372 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod \"e1148263-7b15-4c12-a217-8b030ecd9348\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") "
Feb 23 14:34:27.270705 master-0 kubenswrapper[7728]: I0223 14:34:27.270465 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock" (OuterVolumeSpecName: "var-lock") pod "e1148263-7b15-4c12-a217-8b030ecd9348" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.270785 master-0 kubenswrapper[7728]: I0223 14:34:27.270750 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"e1148263-7b15-4c12-a217-8b030ecd9348\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") "
Feb 23 14:34:27.270929 master-0 kubenswrapper[7728]: I0223 14:34:27.270884 7728 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"e1148263-7b15-4c12-a217-8b030ecd9348\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") "
Feb 23 14:34:27.271202 master-0 kubenswrapper[7728]: I0223 14:34:27.271119 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e1148263-7b15-4c12-a217-8b030ecd9348" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:34:27.271634 master-0 kubenswrapper[7728]: I0223 14:34:27.271602 7728 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.271634 master-0 kubenswrapper[7728]: I0223 14:34:27.271630 7728 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.277764 master-0 kubenswrapper[7728]: I0223 14:34:27.277676 7728 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e1148263-7b15-4c12-a217-8b030ecd9348" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:34:27.403502 master-0 kubenswrapper[7728]: I0223 14:34:27.396089 7728 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") on node \"master-0\" DevicePath \"\""
Feb 23 14:34:27.756225 master-0 kubenswrapper[7728]: I0223 14:34:27.756051 7728 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 23 14:34:27.756263 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Feb 23 14:34:27.780544 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Feb 23 14:34:27.780804 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Feb 23 14:34:27.781864 master-0 systemd[1]: kubelet.service: Consumed 2min 12.290s CPU time.
Feb 23 14:34:27.800759 master-0 systemd[1]: Starting Kubernetes Kubelet...
Feb 23 14:34:27.914607 master-0 kubenswrapper[28758]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 14:34:27.914607 master-0 kubenswrapper[28758]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Feb 23 14:34:27.914607 master-0 kubenswrapper[28758]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 14:34:27.914607 master-0 kubenswrapper[28758]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 14:34:27.914607 master-0 kubenswrapper[28758]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 23 14:34:27.914607 master-0 kubenswrapper[28758]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 14:34:27.915343 master-0 kubenswrapper[28758]: I0223 14:34:27.914723 28758 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918323 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918345 28758 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918350 28758 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918354 28758 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918358 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918362 28758 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918366 28758 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918370 28758 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:34:27.918361 master-0 kubenswrapper[28758]: W0223 14:34:27.918374 28758 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918380 28758 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918386 28758 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918390 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918394 28758 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918397 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918401 28758 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918404 28758 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918408 28758 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918412 28758 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918416 28758 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918420 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918423 28758 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918426 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918430 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918434 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918437 28758 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918441 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918444 28758 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 14:34:27.918763 master-0 kubenswrapper[28758]: W0223 14:34:27.918448 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918452 28758 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918455 28758 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918458 28758 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918462 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918466 28758 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918470 28758 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918490 28758 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918497 28758 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918501 28758 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918506 28758 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918510 28758 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918514 28758 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918517 28758 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918521 28758 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918526 28758 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918531 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918536 28758 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918540 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918543 28758 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:34:27.919214 master-0 kubenswrapper[28758]: W0223 14:34:27.918547 28758 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918550 28758 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918554 28758 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918559 28758 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918563 28758 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918566 28758 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918570 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918574 28758 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918577 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918582 28758 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918587 28758 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918591 28758 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918595 28758 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918598 28758 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918603 28758 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918608 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918612 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918616 28758 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918619 28758 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:34:27.919771 master-0 kubenswrapper[28758]: W0223 14:34:27.918623 28758 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: W0223 14:34:27.918628 28758 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: W0223 14:34:27.918632 28758 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: W0223 14:34:27.918636 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: W0223 14:34:27.918639 28758 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: W0223 14:34:27.918643 28758 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918728 28758 flags.go:64] FLAG: --address="0.0.0.0"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918745 28758 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918753 28758 flags.go:64] FLAG: --anonymous-auth="true"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918758 28758 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918766 28758 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918771 28758 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918776 28758 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918782 28758 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918786 28758 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918791 28758 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918796 28758 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918801 28758 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918805 28758 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918809 28758 flags.go:64] FLAG: --cgroup-root=""
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918813 28758 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918817 28758 flags.go:64] FLAG: --client-ca-file=""
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918821 28758 flags.go:64] FLAG: --cloud-config=""
Feb 23 14:34:27.920456 master-0 kubenswrapper[28758]: I0223 14:34:27.918825 28758 flags.go:64] FLAG: --cloud-provider=""
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918829 28758 flags.go:64] FLAG: --cluster-dns="[]"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918835 28758 flags.go:64] FLAG: --cluster-domain=""
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918839 28758 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918844 28758 flags.go:64] FLAG: --config-dir=""
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918848 28758 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918852 28758 flags.go:64] FLAG: --container-log-max-files="5"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918857 28758 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918862 28758 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918866 28758 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918870 28758 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918874 28758 flags.go:64] FLAG: --contention-profiling="false"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918878 28758 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918882 28758 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918886 28758 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918890 28758 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918895 28758 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918899 28758 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918903 28758 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918907 28758 flags.go:64] FLAG: --enable-load-reader="false"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918911 28758 flags.go:64] FLAG: --enable-server="true"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918915 28758 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918921 28758 flags.go:64] FLAG: --event-burst="100"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918925 28758 flags.go:64] FLAG: --event-qps="50"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918929 28758 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 23 14:34:27.921330 master-0 kubenswrapper[28758]: I0223 14:34:27.918933 28758 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918937 28758 flags.go:64] FLAG: --eviction-hard=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918942 28758 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918948 28758 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918952 28758 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918956 28758 flags.go:64] FLAG: --eviction-soft=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918960 28758 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918964 28758 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918968 28758 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918976 28758 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918980 28758 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918985 28758 flags.go:64] FLAG: --fail-swap-on="true"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918989 28758 flags.go:64] FLAG: --feature-gates=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918994 28758 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.918998 28758 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919002 28758 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919006 28758 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919011 28758 flags.go:64] FLAG: --healthz-port="10248"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919015 28758 flags.go:64] FLAG: --help="false"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919019 28758 flags.go:64] FLAG: --hostname-override=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919023 28758 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919027 28758 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919031 28758 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919035 28758 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919039 28758 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 23 14:34:27.921959 master-0 kubenswrapper[28758]: I0223 14:34:27.919043 28758 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919047 28758 flags.go:64] FLAG: --image-service-endpoint=""
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919050 28758 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919054 28758 flags.go:64] FLAG: --kube-api-burst="100"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919058 28758 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919063 28758 flags.go:64] FLAG: --kube-api-qps="50"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919066 28758 flags.go:64] FLAG: --kube-reserved=""
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919070 28758 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919074 28758 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919078 28758 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919082 28758 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919086 28758 flags.go:64] FLAG: --lock-file=""
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919090 28758 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919094 28758 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919099 28758 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919105 28758 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919111 28758 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919115 28758 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919119 28758 flags.go:64] FLAG: --logging-format="text"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919123 28758 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919128 28758 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919132 28758 flags.go:64] FLAG: --manifest-url=""
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919232 28758 flags.go:64] FLAG: --manifest-url-header=""
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919242 28758 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919247 28758 flags.go:64] FLAG: --max-open-files="1000000"
Feb 23 14:34:27.922703 master-0 kubenswrapper[28758]: I0223 14:34:27.919252 28758 flags.go:64] FLAG: --max-pods="110"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919256 28758 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919261 28758 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919265 28758 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919269 28758 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919273 28758 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919277 28758 flags.go:64] FLAG: --node-ip="192.168.32.10"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919281 28758 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919292 28758 flags.go:64] FLAG: --node-status-max-images="50"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919296 28758 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919302 28758 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919324 28758 flags.go:64] FLAG: --pod-cidr=""
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919328 28758 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6d5001a555eb05eef7f23d64667303c2b4db8343ee900c265f7613c40c1db229"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919335 28758 flags.go:64] FLAG: --pod-manifest-path=""
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919339 28758 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919343 28758 flags.go:64] FLAG: --pods-per-core="0"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919347 28758 flags.go:64] FLAG: --port="10250"
Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919351 28758 flags.go:64] FLAG:
--protect-kernel-defaults="false" Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919355 28758 flags.go:64] FLAG: --provider-id="" Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919359 28758 flags.go:64] FLAG: --qos-reserved="" Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919363 28758 flags.go:64] FLAG: --read-only-port="10255" Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919367 28758 flags.go:64] FLAG: --register-node="true" Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919371 28758 flags.go:64] FLAG: --register-schedulable="true" Feb 23 14:34:27.923337 master-0 kubenswrapper[28758]: I0223 14:34:27.919375 28758 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919385 28758 flags.go:64] FLAG: --registry-burst="10" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919390 28758 flags.go:64] FLAG: --registry-qps="5" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919395 28758 flags.go:64] FLAG: --reserved-cpus="" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919398 28758 flags.go:64] FLAG: --reserved-memory="" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919404 28758 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919408 28758 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919412 28758 flags.go:64] FLAG: --rotate-certificates="false" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919416 28758 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919420 28758 flags.go:64] FLAG: --runonce="false" Feb 23 14:34:27.924227 master-0 
kubenswrapper[28758]: I0223 14:34:27.919424 28758 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919428 28758 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919433 28758 flags.go:64] FLAG: --seccomp-default="false" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919437 28758 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919441 28758 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919445 28758 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919450 28758 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919454 28758 flags.go:64] FLAG: --storage-driver-password="root" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919458 28758 flags.go:64] FLAG: --storage-driver-secure="false" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919462 28758 flags.go:64] FLAG: --storage-driver-table="stats" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919466 28758 flags.go:64] FLAG: --storage-driver-user="root" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919470 28758 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919488 28758 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919493 28758 flags.go:64] FLAG: --system-cgroups="" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919499 28758 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Feb 23 14:34:27.924227 master-0 kubenswrapper[28758]: I0223 14:34:27.919507 28758 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919513 28758 flags.go:64] FLAG: --tls-cert-file="" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919518 28758 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919525 28758 flags.go:64] FLAG: --tls-min-version="" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919530 28758 flags.go:64] FLAG: --tls-private-key-file="" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919534 28758 flags.go:64] FLAG: --topology-manager-policy="none" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919539 28758 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919544 28758 flags.go:64] FLAG: --topology-manager-scope="container" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919551 28758 flags.go:64] FLAG: --v="2" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919556 28758 flags.go:64] FLAG: --version="false" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919562 28758 flags.go:64] FLAG: --vmodule="" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919567 28758 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: I0223 14:34:27.919571 28758 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919674 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919680 28758 
feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919684 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919688 28758 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919692 28758 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919696 28758 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919700 28758 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919703 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919707 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 14:34:27.925030 master-0 kubenswrapper[28758]: W0223 14:34:27.919710 28758 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919714 28758 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919717 28758 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919721 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919724 28758 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 
14:34:27.919729 28758 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919733 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919737 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919741 28758 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919745 28758 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919749 28758 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919776 28758 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919780 28758 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919784 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919788 28758 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919792 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919796 28758 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919800 28758 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 14:34:27.925691 master-0 
kubenswrapper[28758]: W0223 14:34:27.919806 28758 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919812 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 23 14:34:27.925691 master-0 kubenswrapper[28758]: W0223 14:34:27.919816 28758 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919819 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919823 28758 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919827 28758 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919830 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919834 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919838 28758 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919842 28758 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919846 28758 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919850 28758 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919853 28758 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 14:34:27.926234 master-0 
kubenswrapper[28758]: W0223 14:34:27.919857 28758 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919860 28758 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919863 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919867 28758 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919871 28758 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919875 28758 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919879 28758 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919883 28758 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 14:34:27.926234 master-0 kubenswrapper[28758]: W0223 14:34:27.919888 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919892 28758 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919895 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919898 28758 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919902 28758 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919906 28758 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919909 28758 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919912 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919916 28758 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919919 28758 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919923 28758 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919927 28758 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919931 28758 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919936 28758 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919940 28758 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919944 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919947 28758 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919950 28758 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919954 28758 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919960 28758 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:34:27.926824 master-0 kubenswrapper[28758]: W0223 14:34:27.919965 28758 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.919968 28758 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.919973 28758 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.919976 28758 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: I0223 14:34:27.919990 28758 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: I0223 14:34:27.924466 28758 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: I0223 14:34:27.924594 28758 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924660 28758 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924666 28758 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924671 28758 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924687 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924691 28758 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924695 28758 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924699 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924703 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 14:34:27.927566 master-0 kubenswrapper[28758]: W0223 14:34:27.924707 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924711 28758 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924714 28758 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924718 28758 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924721 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924725 28758 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924729 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924733 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924738 28758 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924742 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924746 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924750 28758 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924754 28758 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924764 28758 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924767 28758 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924775 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924780 28758 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924783 28758 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924788 28758 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924792 28758 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 14:34:27.928090 master-0 kubenswrapper[28758]: W0223 14:34:27.924796 28758 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924800 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924804 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924808 28758 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924811 28758 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924815 28758 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924818 28758 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924822 28758 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924825 28758 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924829 28758 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924832 28758 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924836 28758 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924839 28758 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924843 28758 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924846 28758 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924850 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924853 28758 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924858 28758 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924864 28758 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:34:27.928597 master-0 kubenswrapper[28758]: W0223 14:34:27.924868 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924872 28758 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924876 28758 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924881 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924885 28758 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924888 28758 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924892 28758 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924903 28758 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924907 28758 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924910 28758 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924914 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924917 28758 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924921 28758 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924925 28758 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924930 28758 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924939 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924947 28758 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924952 28758 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924957 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924961 28758 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 14:34:27.929168 master-0 kubenswrapper[28758]: W0223 14:34:27.924967 28758 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.924971 28758 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.924976 28758 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.924980 28758 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.924984 28758 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: I0223 14:34:27.924990 28758 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925124 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925131 28758 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925135 28758 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925140 28758 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925145 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925149 28758 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925153 28758 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925158 28758 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925166 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Feb 23 14:34:27.929661 master-0 kubenswrapper[28758]: W0223 14:34:27.925173 28758 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925178 28758 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925182 28758 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925187 28758 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925192 28758 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925197 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925201 28758 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925204 28758 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 23 
14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925208 28758 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925212 28758 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925216 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925219 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925224 28758 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925228 28758 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925233 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925237 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925242 28758 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925246 28758 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925253 28758 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 23 14:34:27.930025 master-0 kubenswrapper[28758]: W0223 14:34:27.925258 28758 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925264 28758 feature_gate.go:330] 
unrecognized feature gate: NodeDisruptionPolicy Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925268 28758 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925273 28758 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925278 28758 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925283 28758 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925296 28758 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925301 28758 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925305 28758 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925310 28758 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925314 28758 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925318 28758 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925323 28758 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925327 28758 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925332 28758 feature_gate.go:330] unrecognized 
feature gate: VSphereStaticIPs Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925337 28758 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925341 28758 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925346 28758 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925351 28758 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925355 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 23 14:34:27.930739 master-0 kubenswrapper[28758]: W0223 14:34:27.925359 28758 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925364 28758 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925370 28758 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925377 28758 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925382 28758 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925386 28758 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925391 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925395 28758 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925399 28758 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925402 28758 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925406 28758 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925411 28758 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925416 28758 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925422 28758 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925427 28758 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925431 28758 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 
14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925436 28758 feature_gate.go:330] unrecognized feature gate: Example Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925441 28758 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925446 28758 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925450 28758 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 23 14:34:27.931398 master-0 kubenswrapper[28758]: W0223 14:34:27.925455 28758 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: W0223 14:34:27.925459 28758 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: W0223 14:34:27.925464 28758 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: W0223 14:34:27.925469 28758 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.925500 28758 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.925668 28758 server.go:940] "Client rotation is on, will bootstrap in background" Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.927200 28758 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.927305 28758 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.927595 28758 server.go:997] "Starting client certificate rotation" Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.927606 28758 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.927765 28758 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 14:08:38 +0000 UTC, rotation deadline is 2026-02-24 08:35:33.220192186 +0000 UTC Feb 23 14:34:27.931978 master-0 kubenswrapper[28758]: I0223 14:34:27.927849 28758 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h1m5.292345767s for next certificate rotation Feb 23 14:34:27.932318 master-0 kubenswrapper[28758]: I0223 14:34:27.928349 28758 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 14:34:27.932318 master-0 kubenswrapper[28758]: I0223 14:34:27.929687 28758 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 14:34:27.933186 master-0 kubenswrapper[28758]: I0223 14:34:27.932934 28758 log.go:25] "Validated CRI v1 runtime API" Feb 23 14:34:27.937227 master-0 kubenswrapper[28758]: I0223 14:34:27.937187 28758 log.go:25] "Validated CRI v1 image API" Feb 23 14:34:27.938615 master-0 kubenswrapper[28758]: I0223 14:34:27.938032 28758 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 23 14:34:27.970375 master-0 kubenswrapper[28758]: I0223 14:34:27.970298 28758 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 be859281-f98a-48e6-a6b4-cc97afbc917c:/dev/vda3] Feb 23 14:34:27.972921 master-0 kubenswrapper[28758]: I0223 14:34:27.970365 28758 fs.go:136] Filesystem partitions: 
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/121fb1d62a402b22b2ce0dcefcc58af76b44ad548fdacc6da5113c93b5d1d4e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/121fb1d62a402b22b2ce0dcefcc58af76b44ad548fdacc6da5113c93b5d1d4e0/userdata/shm major:0 minor:359 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf/userdata/shm major:0 minor:93 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640/userdata/shm major:0 minor:809 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/15c8631804d0c1c71f4b64c5ee4ec4990f2c2f6adda4a03d015df366f3b28fd1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/15c8631804d0c1c71f4b64c5ee4ec4990f2c2f6adda4a03d015df366f3b28fd1/userdata/shm major:0 minor:569 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36/userdata/shm major:0 minor:757 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39/userdata/shm major:0 minor:415 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/223e6055ac3ccbf4f5095a5d877f5de6b8592fbf41a24e3985f9a14f56619a70/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/223e6055ac3ccbf4f5095a5d877f5de6b8592fbf41a24e3985f9a14f56619a70/userdata/shm major:0 minor:876 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb/userdata/shm major:0 minor:870 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9/userdata/shm major:0 minor:874 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a52d8e1940b8a601f24fbfede361672aeb32a5195856b935f21043b52b85ae5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a52d8e1940b8a601f24fbfede361672aeb32a5195856b935f21043b52b85ae5/userdata/shm major:0 minor:807 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/305f42f52b6ba5ef239c92f6ac8cee0e2721fbe74d1ef92a70428b3a6fabdd04/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/305f42f52b6ba5ef239c92f6ac8cee0e2721fbe74d1ef92a70428b3a6fabdd04/userdata/shm major:0 minor:549 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/38ec00e9dfbef6fee1166da5b097f1e6a12696d48f32303b497f0b2d760141c8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/38ec00e9dfbef6fee1166da5b097f1e6a12696d48f32303b497f0b2d760141c8/userdata/shm major:0 minor:1225 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7/userdata/shm major:0 minor:430 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a/userdata/shm major:0 minor:507 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3debae3963d7ed5cbda602b7ac89dc5c4d861ae9dad9b89fb0b3fcce27f1aad1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3debae3963d7ed5cbda602b7ac89dc5c4d861ae9dad9b89fb0b3fcce27f1aad1/userdata/shm major:0 minor:1131 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a/userdata/shm major:0 minor:384 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474/userdata/shm major:0 minor:881 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98/userdata/shm major:0 minor:295 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242/userdata/shm major:0 minor:750 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4c87997a2f68dd1175880f954711e05356d4ed55a7a6f3583b752f9d11da5e55/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4c87997a2f68dd1175880f954711e05356d4ed55a7a6f3583b752f9d11da5e55/userdata/shm major:0 minor:883 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a/userdata/shm major:0 minor:285 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5b1e3102064a2333a33694b441523235b1896dd8c0ad7164b8c2f46c1cc4e9c2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5b1e3102064a2333a33694b441523235b1896dd8c0ad7164b8c2f46c1cc4e9c2/userdata/shm major:0 minor:833 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5c45c6ef7e4b05f37927f974cf6cb4b129b6dd3a04cd82a3e97ac1e29ceb5010/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5c45c6ef7e4b05f37927f974cf6cb4b129b6dd3a04cd82a3e97ac1e29ceb5010/userdata/shm major:0 minor:835 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/601970e99fee05d1ddde3baeb681b21e539729838cc176b833fde61b155a74a5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/601970e99fee05d1ddde3baeb681b21e539729838cc176b833fde61b155a74a5/userdata/shm major:0 minor:550 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8/userdata/shm major:0 minor:533 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8/userdata/shm major:0 minor:269 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/71856c04f28ed0a4e9a36c70729f2e0d164816c342db7fab0a6d5f76b0f61b6a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/71856c04f28ed0a4e9a36c70729f2e0d164816c342db7fab0a6d5f76b0f61b6a/userdata/shm major:0 minor:379 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7644d6b4dd6d2352356f500ef21c6602c372872ba1a236023043ba253ba34314/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7644d6b4dd6d2352356f500ef21c6602c372872ba1a236023043ba253ba34314/userdata/shm major:0 minor:326 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe/userdata/shm major:0 minor:77 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f/userdata/shm major:0 minor:92 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca/userdata/shm major:0 minor:282 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7c2bb3b30fb024eb641e1765113a1a36bccda4c627461f72aa312212e851ddb2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7c2bb3b30fb024eb641e1765113a1a36bccda4c627461f72aa312212e851ddb2/userdata/shm major:0 minor:839 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f/userdata/shm major:0 minor:124 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9/userdata/shm major:0 minor:1139 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/842c59a633c6726baab1699104248bceff992214333b768aa99b1550ee1de3d0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/842c59a633c6726baab1699104248bceff992214333b768aa99b1550ee1de3d0/userdata/shm major:0 minor:878 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc/userdata/shm major:0 minor:548 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a899f46b5ae367f29ecac877a3d8b6b2ea9e0cf04f3dc088df5a7ab7fffcc36/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a899f46b5ae367f29ecac877a3d8b6b2ea9e0cf04f3dc088df5a7ab7fffcc36/userdata/shm major:0 minor:1328 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40/userdata/shm major:0 minor:488 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb/userdata/shm major:0 minor:82 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/97f199fa26d5c3158a89f49cff2f70c3039a2be9fc4ad7fe0571f7a519be854c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/97f199fa26d5c3158a89f49cff2f70c3039a2be9fc4ad7fe0571f7a519be854c/userdata/shm major:0 minor:362 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274/userdata/shm major:0 minor:144 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/a5727697c2b4cf38a7045ae8edfe9cf2a413bc9b589c95f8630aa7de7ef3ba40/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a5727697c2b4cf38a7045ae8edfe9cf2a413bc9b589c95f8630aa7de7ef3ba40/userdata/shm major:0 minor:837 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a74df791e2285ece031ddb2cb6a548b32c5f641cf114501941f1933c7809fad4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a74df791e2285ece031ddb2cb6a548b32c5f641cf114501941f1933c7809fad4/userdata/shm major:0 minor:595 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff/userdata/shm major:0 minor:168 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871/userdata/shm major:0 minor:417 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f/userdata/shm major:0 minor:264 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042/userdata/shm major:0 minor:301 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9/userdata/shm major:0 minor:786 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5/userdata/shm major:0 minor:520 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa/userdata/shm major:0 minor:1180 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b6a377cd77390650f0232d49deb3c30f98f3070caae925837c6c5aeeec6246e5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b6a377cd77390650f0232d49deb3c30f98f3070caae925837c6c5aeeec6246e5/userdata/shm major:0 minor:1137 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8/userdata/shm major:0 minor:1293 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783/userdata/shm major:0 minor:766 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436/userdata/shm major:0 minor:1281 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc/userdata/shm major:0 minor:279 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d25cb0e087422893459bc0facfa4b23104f61b45c1b3b866e5b4e0e0ad019f99/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d25cb0e087422893459bc0facfa4b23104f61b45c1b3b866e5b4e0e0ad019f99/userdata/shm major:0 minor:504 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d4eec9eade1a6fd9bfe0d642fe3ae425b01a962b7129ee11f0681674274aaff6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d4eec9eade1a6fd9bfe0d642fe3ae425b01a962b7129ee11f0681674274aaff6/userdata/shm major:0 minor:546 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9d7c0c01b5302e99d057b82e04bc20f6aaa2ecd35612cea6195a17dbb1d878e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9d7c0c01b5302e99d057b82e04bc20f6aaa2ecd35612cea6195a17dbb1d878e/userdata/shm major:0 minor:797 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dac97420bb9e3883db6238fff45b69e331260668444b09a65159744c334f79d2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dac97420bb9e3883db6238fff45b69e331260668444b09a65159744c334f79d2/userdata/shm major:0 minor:759 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54/userdata/shm major:0 minor:420 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e75c63f7eeabff918836bbbeb9c11ed440bed473107c3e0b28076ddfdf91aadb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e75c63f7eeabff918836bbbeb9c11ed440bed473107c3e0b28076ddfdf91aadb/userdata/shm major:0 minor:1146 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691/userdata/shm major:0 minor:1041 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/edfafba30f67b299b61cf6429b5bf8b47c050040f18802ede0a7f2834a957ae9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/edfafba30f67b299b61cf6429b5bf8b47c050040f18802ede0a7f2834a957ae9/userdata/shm major:0 minor:551 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e/userdata/shm major:0 minor:677 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1/userdata/shm major:0 minor:1213 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e/userdata/shm major:0 minor:145 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95/userdata/shm major:0 minor:1211 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610/userdata/shm major:0 minor:131 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60/userdata/shm major:0 minor:299 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16/userdata/shm major:0 minor:422 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fee0bde3d0eee2f0bc5f9cbe5f3f907b178692716f3f6aef77b4bea08c864506/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fee0bde3d0eee2f0bc5f9cbe5f3f907b178692716f3f6aef77b4bea08c864506/userdata/shm major:0 minor:418 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~projected/kube-api-access-jtp7w:{mountpoint:/var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~projected/kube-api-access-jtp7w major:0 minor:864 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:862 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~secret/webhook-cert major:0 minor:863 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~projected/kube-api-access-2p6hn:{mountpoint:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~projected/kube-api-access-2p6hn major:0 minor:1145 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/default-certificate major:0 minor:1142 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1143 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/stats-auth major:0 minor:1144 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/08c561b3-613b-425f-9de4-d5fc8762ea51/volumes/kubernetes.io~projected/kube-api-access-phmkf:{mountpoint:/var/lib/kubelet/pods/08c561b3-613b-425f-9de4-d5fc8762ea51/volumes/kubernetes.io~projected/kube-api-access-phmkf major:0 minor:277 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/09d80e28-0b64-4c5d-a9bc-99d843d40165/volumes/kubernetes.io~projected/kube-api-access-g9z2f:{mountpoint:/var/lib/kubelet/pods/09d80e28-0b64-4c5d-a9bc-99d843d40165/volumes/kubernetes.io~projected/kube-api-access-g9z2f major:0 minor:113 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~projected/kube-api-access-44599:{mountpoint:/var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~projected/kube-api-access-44599 major:0 minor:829 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:821 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~secret/srv-cert major:0 minor:818 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~projected/kube-api-access-2rg9g:{mountpoint:/var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~projected/kube-api-access-2rg9g major:0 minor:805 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~secret/cert major:0 minor:803 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:494 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~projected/kube-api-access-94ddm:{mountpoint:/var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~projected/kube-api-access-94ddm major:0 minor:1114 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1113 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1106 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/172d47fd-e1a1-4d77-9e31-c4f22e824d5f/volumes/kubernetes.io~projected/kube-api-access-9x6q2:{mountpoint:/var/lib/kubelet/pods/172d47fd-e1a1-4d77-9e31-c4f22e824d5f/volumes/kubernetes.io~projected/kube-api-access-9x6q2 major:0 minor:466 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/172d47fd-e1a1-4d77-9e31-c4f22e824d5f/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/172d47fd-e1a1-4d77-9e31-c4f22e824d5f/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:373 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~projected/kube-api-access-2w5kr:{mountpoint:/var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~projected/kube-api-access-2w5kr major:0 minor:1168 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:1167 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:1169 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4/volumes/kubernetes.io~projected/kube-api-access-rr7rw:{mountpoint:/var/lib/kubelet/pods/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4/volumes/kubernetes.io~projected/kube-api-access-rr7rw major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:506 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1c60ff3f-2bb1-422e-be27-5eca96d85fd2/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/1c60ff3f-2bb1-422e-be27-5eca96d85fd2/volumes/kubernetes.io~projected/ca-certs major:0 minor:553 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1c60ff3f-2bb1-422e-be27-5eca96d85fd2/volumes/kubernetes.io~projected/kube-api-access-jlz28:{mountpoint:/var/lib/kubelet/pods/1c60ff3f-2bb1-422e-be27-5eca96d85fd2/volumes/kubernetes.io~projected/kube-api-access-jlz28 major:0 minor:556 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~projected/kube-api-access-7hp42:{mountpoint:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~projected/kube-api-access-7hp42 major:0 minor:272 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~secret/serving-cert major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~projected/kube-api-access-5nn4m:{mountpoint:/var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~projected/kube-api-access-5nn4m major:0 minor:828 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:820 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~secret/srv-cert major:0 minor:819 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2e89a047-9ebc-459b-b7b3-e902c1fb0e17/volumes/kubernetes.io~projected/kube-api-access-bhsc6:{mountpoint:/var/lib/kubelet/pods/2e89a047-9ebc-459b-b7b3-e902c1fb0e17/volumes/kubernetes.io~projected/kube-api-access-bhsc6 major:0 minor:427 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f876e5d-2e82-47d0-8a9c-adacf2bddf77/volumes/kubernetes.io~projected/kube-api-access-pwtcj:{mountpoint:/var/lib/kubelet/pods/2f876e5d-2e82-47d0-8a9c-adacf2bddf77/volumes/kubernetes.io~projected/kube-api-access-pwtcj major:0 minor:487 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:254 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/kube-api-access-6qszm:{mountpoint:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/kube-api-access-6qszm major:0 minor:258 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~secret/metrics-tls major:0 minor:540 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/volumes/kubernetes.io~projected/kube-api-access-qtbcj:{mountpoint:/var/lib/kubelet/pods/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/volumes/kubernetes.io~projected/kube-api-access-qtbcj major:0 minor:806 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/volumes/kubernetes.io~secret/cert major:0 minor:804 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f86e881-275c-4387-a23a-06c559c8f1e8/volumes/kubernetes.io~projected/kube-api-access-wp8kk:{mountpoint:/var/lib/kubelet/pods/3f86e881-275c-4387-a23a-06c559c8f1e8/volumes/kubernetes.io~projected/kube-api-access-wp8kk major:0 minor:861 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4373687a-61a0-434b-81f7-3fecaa1494ef/volumes/kubernetes.io~projected/kube-api-access-wv5nj:{mountpoint:/var/lib/kubelet/pods/4373687a-61a0-434b-81f7-3fecaa1494ef/volumes/kubernetes.io~projected/kube-api-access-wv5nj major:0 minor:749 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4373687a-61a0-434b-81f7-3fecaa1494ef/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/4373687a-61a0-434b-81f7-3fecaa1494ef/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:748 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/482284fd-6911-4ba6-8d57-7966cc51117a/volumes/kubernetes.io~projected/kube-api-access-khfkr:{mountpoint:/var/lib/kubelet/pods/482284fd-6911-4ba6-8d57-7966cc51117a/volumes/kubernetes.io~projected/kube-api-access-khfkr major:0 minor:867 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/482284fd-6911-4ba6-8d57-7966cc51117a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/482284fd-6911-4ba6-8d57-7966cc51117a/volumes/kubernetes.io~secret/serving-cert major:0 minor:866 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/483786a0-0a29-44bf-bbd0-2f37e045aa2c/volumes/kubernetes.io~projected/kube-api-access-88qnh:{mountpoint:/var/lib/kubelet/pods/483786a0-0a29-44bf-bbd0-2f37e045aa2c/volumes/kubernetes.io~projected/kube-api-access-88qnh major:0 minor:130 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~projected/kube-api-access-qggzs:{mountpoint:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~projected/kube-api-access-qggzs major:0 minor:259 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:413 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:414 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~projected/kube-api-access-q9tkx:{mountpoint:/var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~projected/kube-api-access-q9tkx major:0 minor:267 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:539 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~projected/kube-api-access-hzrqz:{mountpoint:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~projected/kube-api-access-hzrqz major:0 minor:647 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/encryption-config major:0 minor:645 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/etcd-client major:0 minor:646 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/serving-cert major:0 minor:565 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~projected/kube-api-access-d5f8j:{mountpoint:/var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~projected/kube-api-access-d5f8j major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:538 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5cc28e06-3542-4a25-a8b1-5f5b4ee41114/volumes/kubernetes.io~projected/kube-api-access-phzkn:{mountpoint:/var/lib/kubelet/pods/5cc28e06-3542-4a25-a8b1-5f5b4ee41114/volumes/kubernetes.io~projected/kube-api-access-phzkn major:0 minor:1032 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5cc28e06-3542-4a25-a8b1-5f5b4ee41114/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/5cc28e06-3542-4a25-a8b1-5f5b4ee41114/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1015 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~projected/kube-api-access-v4sbp:{mountpoint:/var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~projected/kube-api-access-v4sbp major:0 minor:284 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:412 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~projected/kube-api-access-2jzsd:{mountpoint:/var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~projected/kube-api-access-2jzsd major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:541 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~projected/ca-certs major:0 minor:555 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~projected/kube-api-access-xlqzc:{mountpoint:/var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~projected/kube-api-access-xlqzc major:0 minor:557 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:554 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~projected/kube-api-access-brd4j:{mountpoint:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~projected/kube-api-access-brd4j major:0 minor:73 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a801da1-a7eb-4187-98b8-315076f55e19/volumes/kubernetes.io~projected/kube-api-access-pqkz4:{mountpoint:/var/lib/kubelet/pods/6a801da1-a7eb-4187-98b8-315076f55e19/volumes/kubernetes.io~projected/kube-api-access-pqkz4 major:0 minor:483 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6a801da1-a7eb-4187-98b8-315076f55e19/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/6a801da1-a7eb-4187-98b8-315076f55e19/volumes/kubernetes.io~secret/metrics-tls major:0 minor:532 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/709ac071-4392-4a3f-a3d1-4bc8ba2f6236/volumes/kubernetes.io~projected/kube-api-access-6qhr9:{mountpoint:/var/lib/kubelet/pods/709ac071-4392-4a3f-a3d1-4bc8ba2f6236/volumes/kubernetes.io~projected/kube-api-access-6qhr9 major:0 minor:383 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/709ac071-4392-4a3f-a3d1-4bc8ba2f6236/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/709ac071-4392-4a3f-a3d1-4bc8ba2f6236/volumes/kubernetes.io~secret/signing-key major:0 minor:348 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76c67569-3a72-4de9-87cd-432a4607b15b/volumes/kubernetes.io~projected/kube-api-access-2jlzj:{mountpoint:/var/lib/kubelet/pods/76c67569-3a72-4de9-87cd-432a4607b15b/volumes/kubernetes.io~projected/kube-api-access-2jlzj major:0 minor:872 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/76c67569-3a72-4de9-87cd-432a4607b15b/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/76c67569-3a72-4de9-87cd-432a4607b15b/volumes/kubernetes.io~secret/proxy-tls major:0 minor:871 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/85365dec-af50-406c-b258-890e4f454c4a/volumes/kubernetes.io~projected/kube-api-access-k5d9w:{mountpoint:/var/lib/kubelet/pods/85365dec-af50-406c-b258-890e4f454c4a/volumes/kubernetes.io~projected/kube-api-access-k5d9w major:0 minor:785 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/85365dec-af50-406c-b258-890e4f454c4a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/85365dec-af50-406c-b258-890e4f454c4a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:781 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~projected/kube-api-access-tr2p2:{mountpoint:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~projected/kube-api-access-tr2p2 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~secret/serving-cert major:0 minor:235 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/87f989cd-6c19-4a30-833a-10e98b7a0326/volumes/kubernetes.io~projected/kube-api-access-wpqzn:{mountpoint:/var/lib/kubelet/pods/87f989cd-6c19-4a30-833a-10e98b7a0326/volumes/kubernetes.io~projected/kube-api-access-wpqzn major:0 minor:1292 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/87f989cd-6c19-4a30-833a-10e98b7a0326/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/87f989cd-6c19-4a30-833a-10e98b7a0326/volumes/kubernetes.io~secret/cert major:0 minor:1287 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8ca3dee6-f651-4536-991c-303752c22f07/volumes/kubernetes.io~projected/kube-api-access-g999k:{mountpoint:/var/lib/kubelet/pods/8ca3dee6-f651-4536-991c-303752c22f07/volumes/kubernetes.io~projected/kube-api-access-g999k major:0 minor:368 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8dd5fa7c-0519-4170-89c6-b369e5fc1990/volumes/kubernetes.io~projected/kube-api-access-chs7z:{mountpoint:/var/lib/kubelet/pods/8dd5fa7c-0519-4170-89c6-b369e5fc1990/volumes/kubernetes.io~projected/kube-api-access-chs7z major:0 minor:1325 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8dd5fa7c-0519-4170-89c6-b369e5fc1990/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/8dd5fa7c-0519-4170-89c6-b369e5fc1990/volumes/kubernetes.io~secret/webhook-certs major:0 minor:1320 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~projected/kube-api-access-x7jvd:{mountpoint:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~projected/kube-api-access-x7jvd major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/etcd-client major:0 minor:243 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/serving-cert major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/92c63c95-e880-4f51-9858-7715343f7bd8/volumes/kubernetes.io~projected/kube-api-access-9tl7p:{mountpoint:/var/lib/kubelet/pods/92c63c95-e880-4f51-9858-7715343f7bd8/volumes/kubernetes.io~projected/kube-api-access-9tl7p major:0 minor:827 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/92c63c95-e880-4f51-9858-7715343f7bd8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/92c63c95-e880-4f51-9858-7715343f7bd8/volumes/kubernetes.io~secret/serving-cert major:0 minor:822 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~projected/kube-api-access-7hnfl:{mountpoint:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~projected/kube-api-access-7hnfl major:0 minor:1280 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:1273 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1278 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1279 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/959c2393-e914-4c10-a18f-b30fcf012d19/volumes/kubernetes.io~projected/kube-api-access-42sml:{mountpoint:/var/lib/kubelet/pods/959c2393-e914-4c10-a18f-b30fcf012d19/volumes/kubernetes.io~projected/kube-api-access-42sml major:0 minor:856 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/959c2393-e914-4c10-a18f-b30fcf012d19/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/959c2393-e914-4c10-a18f-b30fcf012d19/volumes/kubernetes.io~secret/serving-cert major:0 minor:734 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~projected/kube-api-access major:0 minor:263 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~secret/serving-cert major:0 minor:245 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:473 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~empty-dir/tmp major:0 minor:474 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~projected/kube-api-access-nmmn9:{mountpoint:/var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~projected/kube-api-access-nmmn9 major:0 minor:478 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a4ae9292-71dc-4484-b277-43cb26c1e04d/volumes/kubernetes.io~projected/kube-api-access-8llc8:{mountpoint:/var/lib/kubelet/pods/a4ae9292-71dc-4484-b277-43cb26c1e04d/volumes/kubernetes.io~projected/kube-api-access-8llc8 major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~projected/kube-api-access-wzkcs:{mountpoint:/var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~projected/kube-api-access-wzkcs major:0 minor:135 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~secret/metrics-certs major:0 minor:542 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:505 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~projected/kube-api-access-knkx2:{mountpoint:/var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~projected/kube-api-access-knkx2 major:0 minor:436 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:435 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/adbf8f71-f005-4e5b-9de1-e49559cf7386/volumes/kubernetes.io~projected/kube-api-access-5vxqg:{mountpoint:/var/lib/kubelet/pods/adbf8f71-f005-4e5b-9de1-e49559cf7386/volumes/kubernetes.io~projected/kube-api-access-5vxqg major:0 minor:769 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2/volumes/kubernetes.io~projected/kube-api-access-lzj2j:{mountpoint:/var/lib/kubelet/pods/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2/volumes/kubernetes.io~projected/kube-api-access-lzj2j major:0 minor:832 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2/volumes/kubernetes.io~secret/serving-cert major:0 minor:817 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af950a67-1557-4352-8100-27281bb8ecbe/volumes/kubernetes.io~projected/kube-api-access-jxrvf:{mountpoint:/var/lib/kubelet/pods/af950a67-1557-4352-8100-27281bb8ecbe/volumes/kubernetes.io~projected/kube-api-access-jxrvf major:0 minor:755 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/af950a67-1557-4352-8100-27281bb8ecbe/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/af950a67-1557-4352-8100-27281bb8ecbe/volumes/kubernetes.io~secret/proxy-tls major:0 minor:754 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~projected/kube-api-access-fjs6f:{mountpoint:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~projected/kube-api-access-fjs6f major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:140 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~projected/kube-api-access-lr868:{mountpoint:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~projected/kube-api-access-lr868 major:0 minor:256 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~secret/serving-cert major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9774f8c-0f29-46d8-be77-81bcf74d5994/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b9774f8c-0f29-46d8-be77-81bcf74d5994/volumes/kubernetes.io~projected/kube-api-access major:0 minor:580 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9774f8c-0f29-46d8-be77-81bcf74d5994/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b9774f8c-0f29-46d8-be77-81bcf74d5994/volumes/kubernetes.io~secret/serving-cert major:0 minor:579 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~projected/kube-api-access major:0 minor:271 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~secret/serving-cert major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~projected/kube-api-access-kp5kb:{mountpoint:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~projected/kube-api-access-kp5kb major:0 minor:167 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~secret/webhook-cert major:0 minor:166 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c02c8912-46c9-4f86-ad28-9bfb2eca4e54/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/c02c8912-46c9-4f86-ad28-9bfb2eca4e54/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1136 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~projected/kube-api-access-9sflb:{mountpoint:/var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~projected/kube-api-access-9sflb major:0 minor:1130 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~secret/certs major:0 minor:1129 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c67a2ed2-f520-46fc-84d3-6816dc19f4e0/volumes/kubernetes.io~projected/kube-api-access-hj8ff:{mountpoint:/var/lib/kubelet/pods/c67a2ed2-f520-46fc-84d3-6816dc19f4e0/volumes/kubernetes.io~projected/kube-api-access-hj8ff major:0 minor:765 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c67a2ed2-f520-46fc-84d3-6816dc19f4e0/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/c67a2ed2-f520-46fc-84d3-6816dc19f4e0/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:764 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c84f66f0-207e-436a-8f4e-d1971fa815eb/volumes/kubernetes.io~projected/kube-api-access-gzn8r:{mountpoint:/var/lib/kubelet/pods/c84f66f0-207e-436a-8f4e-d1971fa815eb/volumes/kubernetes.io~projected/kube-api-access-gzn8r major:0 minor:783 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~projected/kube-api-access-chznd:{mountpoint:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~projected/kube-api-access-chznd major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~secret/serving-cert major:0 minor:252 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceba7b56-f910-473d-aed5-add94868fb31/volumes/kubernetes.io~projected/kube-api-access-72769:{mountpoint:/var/lib/kubelet/pods/ceba7b56-f910-473d-aed5-add94868fb31/volumes/kubernetes.io~projected/kube-api-access-72769 major:0 minor:684 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ceba7b56-f910-473d-aed5-add94868fb31/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/ceba7b56-f910-473d-aed5-add94868fb31/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:711 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~projected/kube-api-access major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~secret/serving-cert major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~projected/kube-api-access-vp6tj:{mountpoint:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~projected/kube-api-access-vp6tj major:0 minor:257 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ded555da-db03-498e-81a9-ad166f29a2aa/volumes/kubernetes.io~projected/kube-api-access-x4lz2:{mountpoint:/var/lib/kubelet/pods/ded555da-db03-498e-81a9-ad166f29a2aa/volumes/kubernetes.io~projected/kube-api-access-x4lz2 major:0 minor:325 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~projected/kube-api-access-4nr85:{mountpoint:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~projected/kube-api-access-4nr85 major:0 minor:278 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~secret/serving-cert major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e5104cdd-85b8-49ba-95ca-3e9c8218a01e/volumes/kubernetes.io~projected/kube-api-access-8qfr7:{mountpoint:/var/lib/kubelet/pods/e5104cdd-85b8-49ba-95ca-3e9c8218a01e/volumes/kubernetes.io~projected/kube-api-access-8qfr7 major:0 minor:1141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~projected/kube-api-access-2cd7w:{mountpoint:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~projected/kube-api-access-2cd7w major:0 minor:552 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/encryption-config major:0 minor:459 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/etcd-client major:0 minor:463 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/serving-cert major:0 minor:461 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/efdde2df-cd07-4898-88f4-7ecde0e04d7a/volumes/kubernetes.io~projected/kube-api-access-tpc4t:{mountpoint:/var/lib/kubelet/pods/efdde2df-cd07-4898-88f4-7ecde0e04d7a/volumes/kubernetes.io~projected/kube-api-access-tpc4t major:0 minor:784 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~projected/kube-api-access-dzcqx:{mountpoint:/var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~projected/kube-api-access-dzcqx major:0 minor:1210 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1202 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1207 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~projected/kube-api-access-269v7:{mountpoint:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~projected/kube-api-access-269v7 major:0 minor:143 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:142 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~projected/kube-api-access-x8v9z:{mountpoint:/var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~projected/kube-api-access-x8v9z major:0 minor:1209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1115 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1208 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fbb66172-1ea9-4683-b88f-227c4fd94924/volumes/kubernetes.io~projected/kube-api-access-kl87q:{mountpoint:/var/lib/kubelet/pods/fbb66172-1ea9-4683-b88f-227c4fd94924/volumes/kubernetes.io~projected/kube-api-access-kl87q major:0 minor:831 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fbb66172-1ea9-4683-b88f-227c4fd94924/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/fbb66172-1ea9-4683-b88f-227c4fd94924/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:830 fsType:tmpfs blockSize:0} overlay_0-100:{mountpoint:/var/lib/containers/storage/overlay/d705403a0879c0975a3e66550b97a0f81bf93dd642a288baef16233e0be3576a/merged major:0 minor:100 fsType:overlay blockSize:0} overlay_0-1002:{mountpoint:/var/lib/containers/storage/overlay/4e7eb8b97758d81c46fffef581d00ac6d5230abd332762ba9d16a81b95bbc74e/merged major:0 minor:1002 fsType:overlay blockSize:0} overlay_0-1008:{mountpoint:/var/lib/containers/storage/overlay/011df9a046f5fa55513a5405e04e8827272a6b38810ddbe508f5d1cdf1eda729/merged major:0 minor:1008 fsType:overlay blockSize:0} overlay_0-1010:{mountpoint:/var/lib/containers/storage/overlay/424d4c54841343f1f083a81c150e0e2b1ef669f0f010a22c14b94db7a19fde6b/merged major:0 minor:1010 fsType:overlay blockSize:0} overlay_0-1013:{mountpoint:/var/lib/containers/storage/overlay/7e470648df9a3cdf5aa6b5054416bd54f0b2ffd52ba59cd652fafa32b28511b8/merged major:0 minor:1013 fsType:overlay blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/3e23d738fd862891b3c5c6ee84f91a0ce1daa9cf5128e482a3f4b8954a4bf8d3/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-104:{mountpoint:/var/lib/containers/storage/overlay/c978e043c130bcbff262e204597ff151da4195ebd3d14afaf9a80656bb0f6078/merged major:0 minor:104 fsType:overlay blockSize:0} overlay_0-1040:{mountpoint:/var/lib/containers/storage/overlay/0e6920c1ab61ed3270702c0e437b678f04cbd917d41e082786c8b5996bd63be7/merged major:0 minor:1040 fsType:overlay blockSize:0} overlay_0-1047:{mountpoint:/var/lib/containers/storage/overlay/4cbb836374f1448cac7f654ebaafdf683b70a89d5cecdae995902641c3225c52/merged major:0 minor:1047 fsType:overlay blockSize:0} 
overlay_0-1054:{mountpoint:/var/lib/containers/storage/overlay/13e5f0c5fb0ae5e267ff148ecf3fb4218d219ad160106a6eb07d024c8f890c9d/merged major:0 minor:1054 fsType:overlay blockSize:0} overlay_0-1062:{mountpoint:/var/lib/containers/storage/overlay/2b3bda6efad6fb455fe2e3d70affc3f63341758e71bb2c85e11c127bd28e0273/merged major:0 minor:1062 fsType:overlay blockSize:0} overlay_0-1067:{mountpoint:/var/lib/containers/storage/overlay/fdeadf8a7fdf394827c18d02c4a15df5e2297b0469d0cce364314c699887fef5/merged major:0 minor:1067 fsType:overlay blockSize:0} overlay_0-1076:{mountpoint:/var/lib/containers/storage/overlay/93b0c3d2bafe868c4dbcb68462393b1db406345ec4b1db15df211a665822cd3a/merged major:0 minor:1076 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/6a9d7144afab304634f4421dbd76371440974bc716c0b26cd01f589c0034a15c/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-1081:{mountpoint:/var/lib/containers/storage/overlay/ea36aa88f62449d097b59d377aaa64cdbb3a1280594200f807be1c4d8959cdb3/merged major:0 minor:1081 fsType:overlay blockSize:0} overlay_0-1086:{mountpoint:/var/lib/containers/storage/overlay/03deb91126ff9d977c145b65bb0cf772f65b90386bf7d58a27e0478e04be52e0/merged major:0 minor:1086 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/2dbf656ed6802613dfd64bb4e38e0ae43560d7005c1a39964385eb309abd848f/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/aa8bfffbe1cc23950a48e86d7438fcb09c7e8b9c4f14a3ca45177c89e18dfa48/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-1111:{mountpoint:/var/lib/containers/storage/overlay/2d140674cc9f3c5eec3afa1f06e943d1143d67a95e0e80cb6e2a8322d0d4e49f/merged major:0 minor:1111 fsType:overlay blockSize:0} overlay_0-1117:{mountpoint:/var/lib/containers/storage/overlay/861163a2140af6e5f2b1e0bcca9c24a750b4308b4c76bb74fbcd7249d2840d83/merged major:0 minor:1117 fsType:overlay blockSize:0} 
overlay_0-1119:{mountpoint:/var/lib/containers/storage/overlay/58477ef3fff770e2245d35d7ab5ca849bbbbaca32cde09341c12514cd7a420d7/merged major:0 minor:1119 fsType:overlay blockSize:0} overlay_0-1133:{mountpoint:/var/lib/containers/storage/overlay/c56f5027617868c4d358bfa63dc811b0d41d10d2292268dd275c91b19941aba2/merged major:0 minor:1133 fsType:overlay blockSize:0} overlay_0-1134:{mountpoint:/var/lib/containers/storage/overlay/3f2febec48f5db65eb187c4e3bcfd740964c00faff95b6b050abb5cbc06410a8/merged major:0 minor:1134 fsType:overlay blockSize:0} overlay_0-1148:{mountpoint:/var/lib/containers/storage/overlay/faf7fe8dfd02584dd1b66e80617bc62694f80f5946e0567a89c85dce7933dbec/merged major:0 minor:1148 fsType:overlay blockSize:0} overlay_0-1150:{mountpoint:/var/lib/containers/storage/overlay/76d9d8626ea36fc3d1cb703a456335c310de5ba52764a3c42c1c03524f415c67/merged major:0 minor:1150 fsType:overlay blockSize:0} overlay_0-1152:{mountpoint:/var/lib/containers/storage/overlay/972561cb82c9fa66fc4010f3a2af8575c00683badd70f1cff365a3fae05b4123/merged major:0 minor:1152 fsType:overlay blockSize:0} overlay_0-1154:{mountpoint:/var/lib/containers/storage/overlay/28d34ad6d36dbd4be72ac6e78253c6c3c64742c16f1801cae03314fad95b1fb5/merged major:0 minor:1154 fsType:overlay blockSize:0} overlay_0-1161:{mountpoint:/var/lib/containers/storage/overlay/cb1a306b87721b851a7047de330c89b71fbffac1fb7600bddb15694eb5660fda/merged major:0 minor:1161 fsType:overlay blockSize:0} overlay_0-1165:{mountpoint:/var/lib/containers/storage/overlay/e4174898016926df25cc5ae2b81f1297abf66516e09fb009ff4845f0e1799273/merged major:0 minor:1165 fsType:overlay blockSize:0} overlay_0-117:{mountpoint:/var/lib/containers/storage/overlay/fdcae2dea75a1ead7a2b9c07639112c5657b59e502ae0196029a1cfc53b59897/merged major:0 minor:117 fsType:overlay blockSize:0} overlay_0-118:{mountpoint:/var/lib/containers/storage/overlay/a1aa23976f0a98ed5e50531694375ca3b171e1566104ad25da2d5f443d50d8d4/merged major:0 minor:118 fsType:overlay blockSize:0} 
overlay_0-1182:{mountpoint:/var/lib/containers/storage/overlay/3beddb8cf7d70c8b52513b7dadd855e037b1039f8d4e155f75317a50db900a79/merged major:0 minor:1182 fsType:overlay blockSize:0} overlay_0-1184:{mountpoint:/var/lib/containers/storage/overlay/ffed2dc46023c08f5ef87adab929bfba46df86834c0a0c89eddb2c5e2f601542/merged major:0 minor:1184 fsType:overlay blockSize:0} overlay_0-1186:{mountpoint:/var/lib/containers/storage/overlay/90ca952e4d7c78d202db8b970e5badac7b1e51989640a9576cbf1d67378d95f0/merged major:0 minor:1186 fsType:overlay blockSize:0} overlay_0-1196:{mountpoint:/var/lib/containers/storage/overlay/d2915125ffc84f215f0f9a42db539be293c659fe96012ee137c0a164b6231186/merged major:0 minor:1196 fsType:overlay blockSize:0} overlay_0-1200:{mountpoint:/var/lib/containers/storage/overlay/8a4adecc9502699dd2d05f522480e62b80ec856149f594800a99b17759518bde/merged major:0 minor:1200 fsType:overlay blockSize:0} overlay_0-1216:{mountpoint:/var/lib/containers/storage/overlay/02beebb25dddad9e41ba17bbc0cd260aa48c2041dcdc04c0ab401fcc8f40e73e/merged major:0 minor:1216 fsType:overlay blockSize:0} overlay_0-1218:{mountpoint:/var/lib/containers/storage/overlay/cf60f67ddf515501f8dd37b3194e44bbc3ad12902add818426ba4d46c3c78e8b/merged major:0 minor:1218 fsType:overlay blockSize:0} overlay_0-1219:{mountpoint:/var/lib/containers/storage/overlay/ec207f6fd7adfb8d9c4ca043850e250abcf07242bb15a5a6e86e715757a2a866/merged major:0 minor:1219 fsType:overlay blockSize:0} overlay_0-1228:{mountpoint:/var/lib/containers/storage/overlay/9270c88a50053da8468181add4200c0e52f5bbd190ce6e2ad7952499d95408db/merged major:0 minor:1228 fsType:overlay blockSize:0} overlay_0-1230:{mountpoint:/var/lib/containers/storage/overlay/323cd23d868381a8830aec40f4ba4afcbce48984c78a1230c55dcba940ba0377/merged major:0 minor:1230 fsType:overlay blockSize:0} overlay_0-1235:{mountpoint:/var/lib/containers/storage/overlay/87840beb9879a2dec907b136be5558d001b2abd2e5d3d6ee7a01a4e641d486e6/merged major:0 minor:1235 fsType:overlay 
blockSize:0} overlay_0-1237:{mountpoint:/var/lib/containers/storage/overlay/135670920ab7013e73b7027c3183360a386a9c89815e51ccc6a4f1b038ca9953/merged major:0 minor:1237 fsType:overlay blockSize:0} overlay_0-1239:{mountpoint:/var/lib/containers/storage/overlay/1e3edf502fa6a2c7a3508001aa1d0969dc80653932c50f9f2172e231d2158d30/merged major:0 minor:1239 fsType:overlay blockSize:0} overlay_0-1241:{mountpoint:/var/lib/containers/storage/overlay/f301370adceba125a30c274e5280aa22145fba033897bb8dad7c1e15246a0230/merged major:0 minor:1241 fsType:overlay blockSize:0} overlay_0-1243:{mountpoint:/var/lib/containers/storage/overlay/910375cc727d8a38c000012c384c224c0bcba6f05b3cc2e13ef832cbfd50b6d7/merged major:0 minor:1243 fsType:overlay blockSize:0} overlay_0-1258:{mountpoint:/var/lib/containers/storage/overlay/9f69d692dd898a1d72d51e3c295ba920a5462eaf54345ddc817aa7ceccc4154c/merged major:0 minor:1258 fsType:overlay blockSize:0} overlay_0-126:{mountpoint:/var/lib/containers/storage/overlay/65ee14cad8f8e89d1b904e383e96e66f946404750a35c14c963851861e9757a0/merged major:0 minor:126 fsType:overlay blockSize:0} overlay_0-1267:{mountpoint:/var/lib/containers/storage/overlay/1bcc16e259a1ed6735c1424fc66ee2ccedc420bd261bc7805678f0912d9c8cab/merged major:0 minor:1267 fsType:overlay blockSize:0} overlay_0-1270:{mountpoint:/var/lib/containers/storage/overlay/b917fe44f7a8ca46079c6f0ff915d09a695fc04707af0ebc14f7128f24876c03/merged major:0 minor:1270 fsType:overlay blockSize:0} overlay_0-1271:{mountpoint:/var/lib/containers/storage/overlay/e804baadf7e16655ab12d5d8886ad07c2fe4c9806f69fe84c83b7327750c5b24/merged major:0 minor:1271 fsType:overlay blockSize:0} overlay_0-1275:{mountpoint:/var/lib/containers/storage/overlay/01e089736be052c59b9ae61fca9d1efc9f55075ba3eeb814342a6427d2fd2d40/merged major:0 minor:1275 fsType:overlay blockSize:0} overlay_0-1277:{mountpoint:/var/lib/containers/storage/overlay/72b8d7eeba79d2ba4969313fca8dea732630e7a78b99a09b0e514080df7f9ec9/merged major:0 minor:1277 fsType:overlay 
blockSize:0} overlay_0-128:{mountpoint:/var/lib/containers/storage/overlay/eb92f150005dab97dfb2d9434859cf5d7e5eee041a906fecce8c699227e08423/merged major:0 minor:128 fsType:overlay blockSize:0} overlay_0-1283:{mountpoint:/var/lib/containers/storage/overlay/cf0ca079fd1b4caf245cea5a7afd01e88a0ceef210531bbed8378671dc0b5408/merged major:0 minor:1283 fsType:overlay blockSize:0} overlay_0-1285:{mountpoint:/var/lib/containers/storage/overlay/e28d0c948278c4bb573cc6e584d4853f00a1559fa6b41a71a8c4d1ea45b66486/merged major:0 minor:1285 fsType:overlay blockSize:0} overlay_0-1296:{mountpoint:/var/lib/containers/storage/overlay/dd67c74b551e972bc905c910347b8ebb867e171079b6f9a7feabe7986544af0a/merged major:0 minor:1296 fsType:overlay blockSize:0} overlay_0-1298:{mountpoint:/var/lib/containers/storage/overlay/9f2b17a437c904466050efdea5a6719324a8d793769d2b59f22856ad9b5d95d1/merged major:0 minor:1298 fsType:overlay blockSize:0} overlay_0-1300:{mountpoint:/var/lib/containers/storage/overlay/b99662ffdcc1795bd3ed267dc601fac1c1ade7644d8645ff4d1b3272acfefaff/merged major:0 minor:1300 fsType:overlay blockSize:0} overlay_0-1304:{mountpoint:/var/lib/containers/storage/overlay/5c8e05e8442c399854010b2ff3731abb2860564f3a53be44de8581f0f7470a00/merged major:0 minor:1304 fsType:overlay blockSize:0} overlay_0-1317:{mountpoint:/var/lib/containers/storage/overlay/3af0d4748fb675421281e6f07ca6d0e7aa7bcbd694aeeb9f43916b67c47c5e57/merged major:0 minor:1317 fsType:overlay blockSize:0} overlay_0-1322:{mountpoint:/var/lib/containers/storage/overlay/0a40c271a3e241a4d7d29171b9a303c3bb4cb35761316d701125efed36629637/merged major:0 minor:1322 fsType:overlay blockSize:0} overlay_0-1326:{mountpoint:/var/lib/containers/storage/overlay/460fac3216e332f0f68fc8833dd2daec80484fd5b8a48c2dbb100c9ce3d71cdb/merged major:0 minor:1326 fsType:overlay blockSize:0} overlay_0-133:{mountpoint:/var/lib/containers/storage/overlay/f36e77d1a7f8bcfa0abd0da89b68bc1acea7f4ab44b36a59515228e130593ab3/merged major:0 minor:133 fsType:overlay 
blockSize:0} overlay_0-1330:{mountpoint:/var/lib/containers/storage/overlay/98c650cbc803d5fe147f01174cd6a179f750761400be9073315a016a10ba7011/merged major:0 minor:1330 fsType:overlay blockSize:0} overlay_0-1332:{mountpoint:/var/lib/containers/storage/overlay/42a30c67fbc94f4f7d24e55bdfe2cd09f3d7e5faf218544662a9a8ba601fd35b/merged major:0 minor:1332 fsType:overlay blockSize:0} overlay_0-1343:{mountpoint:/var/lib/containers/storage/overlay/77bceeeea9dcd21b200f5093cdf6b0dc50432f90200d495fe6c86dc258d448f6/merged major:0 minor:1343 fsType:overlay blockSize:0} overlay_0-1348:{mountpoint:/var/lib/containers/storage/overlay/c2684177b1c08a479300b13324cd9b0757f0195131e11770c7799656de01198e/merged major:0 minor:1348 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/cab0cf6a3545e3e03202870da16a0f9b15304a7d67884a95e0603397075cf891/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/6f8d98512dccef2bf36b0e248f554e9ff2c819756dda85ebe869a85bb6873172/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-148:{mountpoint:/var/lib/containers/storage/overlay/4dafe75899de7a156b4aa675f65ba30667db6c91576d7feb74f9a5c7dda5586a/merged major:0 minor:148 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/d3cc6d80825cea1b3b543f8f76c93eaad81913823bc8b5613f1475c3af3b9369/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/41daa65fd8ca427bc31044355e67d5fbecef2836af01f5ca37cf0caf5ee25ec4/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/3333daf134cb3da0245069a02c7243182aedd8032f3702a16979a442df0d8ec1/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-159:{mountpoint:/var/lib/containers/storage/overlay/4d25ac30bfc8b0dc4a9dee0884be64c59b3b4a06b12fc37141083d69359d3c63/merged major:0 minor:159 fsType:overlay 
blockSize:0} overlay_0-164:{mountpoint:/var/lib/containers/storage/overlay/416fdbd7f81cdc209c426295b456e9f3b325016c99af172a0cdb54ee1b720108/merged major:0 minor:164 fsType:overlay blockSize:0} overlay_0-170:{mountpoint:/var/lib/containers/storage/overlay/38f99c8ad9bd37b9b4302ffff5c5e979340a349394f8d63ad9affee8a695b213/merged major:0 minor:170 fsType:overlay blockSize:0} overlay_0-172:{mountpoint:/var/lib/containers/storage/overlay/6d458c784040933dc173e5962450a973b8341f582593afa9496558be7640fe0e/merged major:0 minor:172 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/450c703987cbb7e8e6d3ebe4ebb96d3e61162f0469e9f739001f2a8881f943e7/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-176:{mountpoint:/var/lib/containers/storage/overlay/c36ac0523fb77eb41a58606dc01d214baf8615e5bbeb8ff440c70ab25e948a9b/merged major:0 minor:176 fsType:overlay blockSize:0} overlay_0-178:{mountpoint:/var/lib/containers/storage/overlay/cee62bc9f0ca9757435066cbff749f999feef7c526f307853875114308836b79/merged major:0 minor:178 fsType:overlay blockSize:0} overlay_0-180:{mountpoint:/var/lib/containers/storage/overlay/f3a06b6bc5cc9309a5e816dc7e624e48b2fd95e158eaf6ec7a977ce6ed11e8f4/merged major:0 minor:180 fsType:overlay blockSize:0} overlay_0-182:{mountpoint:/var/lib/containers/storage/overlay/1e3ee3b4e61b65508bb9a9c3ce4e93682fca7288ae8c4afe90b5241b15eeb3ac/merged major:0 minor:182 fsType:overlay blockSize:0} overlay_0-190:{mountpoint:/var/lib/containers/storage/overlay/58f913d8907dd224ec3b885692f70c1043484d915de9010496ea8ceddd1aa889/merged major:0 minor:190 fsType:overlay blockSize:0} overlay_0-192:{mountpoint:/var/lib/containers/storage/overlay/51ddc9b6e59ecc9f855b0bb8fd2248ea06dad26175bb39cd321e6c96e8982f82/merged major:0 minor:192 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/d497eaae2d03ece72bf60d41fc2f6fed71fb286996f169ec890b9be8f7a9b389/merged major:0 minor:193 fsType:overlay blockSize:0} 
overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/76cb5f972c693019121a16c61034de0e13b2eaaa2ea5c6d7cf4d72ebef5bcefe/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-200:{mountpoint:/var/lib/containers/storage/overlay/da967a919abebb50750e8c0d85bef2750865d0726afd702b1d1a3f17dfe53bad/merged major:0 minor:200 fsType:overlay blockSize:0} overlay_0-205:{mountpoint:/var/lib/containers/storage/overlay/1a68b7abdc1772ea02040d6b0540fe9d6e306c2754d5742bf76423df09a9a7b8/merged major:0 minor:205 fsType:overlay blockSize:0} overlay_0-210:{mountpoint:/var/lib/containers/storage/overlay/245e9b1536351d0c345cb010b3854cf958019e458799ecda785c2e646b1533aa/merged major:0 minor:210 fsType:overlay blockSize:0} overlay_0-215:{mountpoint:/var/lib/containers/storage/overlay/c0e9ed247f3ae29cce83d4673289a9d6f58da74c2a0ac713b5699a6fc3d6f605/merged major:0 minor:215 fsType:overlay blockSize:0} overlay_0-220:{mountpoint:/var/lib/containers/storage/overlay/5fee5e65188a47c8fb1d51bad993bbec9a22552dcb8f6652e23cea5a06aef08c/merged major:0 minor:220 fsType:overlay blockSize:0} overlay_0-221:{mountpoint:/var/lib/containers/storage/overlay/a5c42f1605b03d1d312007f4db64dc34a7c6cd9b8b8616276f3c0b24e8b7e650/merged major:0 minor:221 fsType:overlay blockSize:0} overlay_0-230:{mountpoint:/var/lib/containers/storage/overlay/6938a7f15f3e6a2e952b5dd5e28c577ac4d16eadc2b89da7147e74db80f20045/merged major:0 minor:230 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/fd45fad339d4e6dd1831b338f9a235c704498ed73e806313f3f04902ebd9428f/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/59e96ceb691c7ea249ed2a793caeb2025d0fe992eaa028ba9c99c2b376fd863d/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/de7124b09be7c8692a63b3d1a8ea9f9f83da6a0eee30081f57e20a98470ba8c8/merged major:0 minor:291 fsType:overlay blockSize:0} 
overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/947b9b54a7154bfe64998abbd37772745f37b596f118913ecd5b817e6399e08d/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/3120d70b85fb10291a840bc6a1b4850fcf98f369a88a30a9cf6401424a58a8e5/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/02e6bec4520bbf2009bcda686373feb51de4291850d5a3614e322a9b5c365b91/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/91baa4597c1a7f1c159d6bb3eb6edd6cb43dfe7701c56233800fe59c612ecca5/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-307:{mountpoint:/var/lib/containers/storage/overlay/cb29400d07350b1ef1d59b54e1d2af266f1c0839280e91112b28e773056b97e7/merged major:0 minor:307 fsType:overlay blockSize:0} overlay_0-309:{mountpoint:/var/lib/containers/storage/overlay/12bbc7c46d95b08f346a3f5f71c8a0540090dc539a1b2c8b5350a14f78c46334/merged major:0 minor:309 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/e0bb40a25462827249a9ec7a4d574e757daecd055c7ca8c0b02fabf14528f9ed/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/6329e16ae625fb7e5314e841d6aba1be9fe513780d9fdafa4adc9eee32be69ed/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-315:{mountpoint:/var/lib/containers/storage/overlay/8aa793db136d69396e48e1178df52c10748763169e9d74c379a148935c773ce4/merged major:0 minor:315 fsType:overlay blockSize:0} overlay_0-317:{mountpoint:/var/lib/containers/storage/overlay/1009f928e04db67d5b9bce4f0015c9968bf4372829e27df4fb8f204db02cefbf/merged major:0 minor:317 fsType:overlay blockSize:0} overlay_0-319:{mountpoint:/var/lib/containers/storage/overlay/57c349b0dbe7e8c2c5decc44840351a66d8b39475a4814668d89f8d47a3934a2/merged major:0 minor:319 fsType:overlay blockSize:0} 
overlay_0-320:{mountpoint:/var/lib/containers/storage/overlay/ded62fd2fdfe2ffe0741ab88b9d87d5d5e5711202cc150055ac74c3e1e1f17e1/merged major:0 minor:320 fsType:overlay blockSize:0} overlay_0-322:{mountpoint:/var/lib/containers/storage/overlay/f0f9e9d940511f7798d273b0c34d3f316a0aa2a5cf0666d0dd149428bc66d337/merged major:0 minor:322 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/bb589fc386294f727b23bb1d779a5465effc060b38c535a5dc692b6ed93a49df/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-330:{mountpoint:/var/lib/containers/storage/overlay/d1aafc7a7f667de7dd2f90f0b07df72240c9d469e66ccac9fa3ed6051951c342/merged major:0 minor:330 fsType:overlay blockSize:0} overlay_0-341:{mountpoint:/var/lib/containers/storage/overlay/b7dd458b564fa0872ee59943c1e7a15f727a47356e558d3101c9f8dbfa52a9c6/merged major:0 minor:341 fsType:overlay blockSize:0} overlay_0-344:{mountpoint:/var/lib/containers/storage/overlay/802244a947bc190dbc1020e094821f1baa1a29dd3362bbae9d6c4a0ec30bfba8/merged major:0 minor:344 fsType:overlay blockSize:0} overlay_0-346:{mountpoint:/var/lib/containers/storage/overlay/087309d921d2518570bc8b69d43153d6ab5e14f88163e7e3d7f50e0841c53098/merged major:0 minor:346 fsType:overlay blockSize:0} overlay_0-350:{mountpoint:/var/lib/containers/storage/overlay/4fac155e82eabf0cf9ef2ebcae489c490f0b8f8197046a927e9e8b1e001d35d7/merged major:0 minor:350 fsType:overlay blockSize:0} overlay_0-360:{mountpoint:/var/lib/containers/storage/overlay/6455612d93f09fd74729d5c7107422c7131b308111a2109ba38b507d969d7b61/merged major:0 minor:360 fsType:overlay blockSize:0} overlay_0-361:{mountpoint:/var/lib/containers/storage/overlay/fb778582d3c4b287e855f753195144d25626018e5429bd4783c51096cdf4d2c5/merged major:0 minor:361 fsType:overlay blockSize:0} overlay_0-363:{mountpoint:/var/lib/containers/storage/overlay/03cb8b30ffe16fe2d7da77032dafc987e354da5b7f7bca5fd6fbc8705ba91a26/merged major:0 minor:363 fsType:overlay blockSize:0} 
overlay_0-370:{mountpoint:/var/lib/containers/storage/overlay/9335b790bf6f65b3bd509a1f56bf915aab8a29ce6c2f4869212a44704a9f4521/merged major:0 minor:370 fsType:overlay blockSize:0} overlay_0-371:{mountpoint:/var/lib/containers/storage/overlay/7699c6f096b12519f98539db59e65775133ee00f85db4519506e7a23aa7a5b87/merged major:0 minor:371 fsType:overlay blockSize:0} overlay_0-381:{mountpoint:/var/lib/containers/storage/overlay/3906c1284c2d30621f583fca4a243806e48a5a970070492aefc5471cf189af64/merged major:0 minor:381 fsType:overlay blockSize:0} overlay_0-386:{mountpoint:/var/lib/containers/storage/overlay/cbbe0e84cc13474ee00105897bcd216a42a1069e73defd961c430a6849c4c0c7/merged major:0 minor:386 fsType:overlay blockSize:0} overlay_0-388:{mountpoint:/var/lib/containers/storage/overlay/c727a87caa543d1ecdc5ecbba09b2d536e52826c8c812d7ceba40b5b8272964c/merged major:0 minor:388 fsType:overlay blockSize:0} overlay_0-394:{mountpoint:/var/lib/containers/storage/overlay/80be0fa502a658f88356a645294653c5ea25d8b5ea43023e68feab08f6fe5ce8/merged major:0 minor:394 fsType:overlay blockSize:0} overlay_0-396:{mountpoint:/var/lib/containers/storage/overlay/7fb4a9486acbf1e51488fc884496bea0fe140a168d30488c0da64b291c82b907/merged major:0 minor:396 fsType:overlay blockSize:0} overlay_0-398:{mountpoint:/var/lib/containers/storage/overlay/474f5eab0fbded11eb1d6fc21d715a7e6336a537130ef7dc7b2d377f52bd4fa1/merged major:0 minor:398 fsType:overlay blockSize:0} overlay_0-404:{mountpoint:/var/lib/containers/storage/overlay/378d56ecd26135a68210cbf73557c3e5151faac2ab1c7a310cd9a95f1b4faa3b/merged major:0 minor:404 fsType:overlay blockSize:0} overlay_0-406:{mountpoint:/var/lib/containers/storage/overlay/7ddf4a46ea819a3b07157328be9474b1a0753cec325b248869481d785e8c7e72/merged major:0 minor:406 fsType:overlay blockSize:0} overlay_0-409:{mountpoint:/var/lib/containers/storage/overlay/1d74cc20b9a1e76c77dfd3b2d997b3fc43696796cdf6258550b640e14256c43c/merged major:0 minor:409 fsType:overlay blockSize:0} 
overlay_0-423:{mountpoint:/var/lib/containers/storage/overlay/6216c21cc81258bde53b00b69fb04abd6ee86891754730e8312315cb1170126d/merged major:0 minor:423 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/1ede37b5d00e325b67ec0945589de08ef85d094225ebda9907abd4d49410e3bc/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-432:{mountpoint:/var/lib/containers/storage/overlay/4d8b51581ddbe66badfc88b19600a29bce549918757b85b611012201032f93b4/merged major:0 minor:432 fsType:overlay blockSize:0} overlay_0-433:{mountpoint:/var/lib/containers/storage/overlay/f7713a878f320e6c780a1616906da4342d026d0a25852f36616c9e5aa5e48577/merged major:0 minor:433 fsType:overlay blockSize:0} overlay_0-438:{mountpoint:/var/lib/containers/storage/overlay/d3f8be2e96ad9e34de77ab6ea2b3f3095495f41cac2c3199f5763ba258822e64/merged major:0 minor:438 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/54bd9e3af303047c39ef3094d5b669609a7bb95a28e2a8bb2549ed5b2da259dc/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-440:{mountpoint:/var/lib/containers/storage/overlay/4c0da8d5e0a5b46fba6d09a1ca7b03c9b4ffbfe6ee39e8253675b9eb2cdf1dc3/merged major:0 minor:440 fsType:overlay blockSize:0} overlay_0-442:{mountpoint:/var/lib/containers/storage/overlay/4e1164a4fde5210b28f8300d41111a2f034582f9cc7070d0f398e361349a2577/merged major:0 minor:442 fsType:overlay blockSize:0} overlay_0-448:{mountpoint:/var/lib/containers/storage/overlay/fa7d1d36b77f3a28bba6b2ee514a59bd4ff932cc44d5d37359070b5f3d60aff2/merged major:0 minor:448 fsType:overlay blockSize:0} overlay_0-450:{mountpoint:/var/lib/containers/storage/overlay/3f99ded4ed7a9102a0d0cfde0da714e5f6df33e2369f287464f7a6ab0a3bee4d/merged major:0 minor:450 fsType:overlay blockSize:0} overlay_0-452:{mountpoint:/var/lib/containers/storage/overlay/31bf1e23c33d7697789638c13b5ea1148fe99f092eb4fd9cd16d555b127a4115/merged major:0 minor:452 fsType:overlay blockSize:0} 
overlay_0-457:{mountpoint:/var/lib/containers/storage/overlay/5824937b60a466a06ff67b4d9e77ee0fb1e6c00584b0eb45316e9c9791725704/merged major:0 minor:457 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/15f7f2fb148857bb79eb6121e016cf0821ed4b1d5e4e0f2b71ef8b66f516c4a1/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-460:{mountpoint:/var/lib/containers/storage/overlay/b948f21c9048d32103a4f15f545eb39e0747c03f2ed976e075f30c51f1880454/merged major:0 minor:460 fsType:overlay blockSize:0} overlay_0-464:{mountpoint:/var/lib/containers/storage/overlay/8222ce0ca88ea29d65c12d498d93482203d9a411a277f047b3efd9724f16c671/merged major:0 minor:464 fsType:overlay blockSize:0} overlay_0-468:{mountpoint:/var/lib/containers/storage/overlay/6a16601de81746f66591e16f99dc9aae40f9250182b2e4c0d00761e84465bf87/merged major:0 minor:468 fsType:overlay blockSize:0} overlay_0-470:{mountpoint:/var/lib/containers/storage/overlay/e9c7f33b0d390fe8bc15c92c7c2770fe5ac6ff6c48d57828c743e09ff894d0eb/merged major:0 minor:470 fsType:overlay blockSize:0} overlay_0-479:{mountpoint:/var/lib/containers/storage/overlay/f03ec0d60841e89d47a8610b5e3dc50cec7e8554f7f8ff99d659f9b0b66bdb65/merged major:0 minor:479 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/6be5fb791beed4456a249b88ee19caaa9cfc7ac88328db86c60cb4ee2ad1506c/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-481:{mountpoint:/var/lib/containers/storage/overlay/18b362b1607756ba75a58e25b9b3c793db93e7d83539d2dd3eb6babb23c4c9e3/merged major:0 minor:481 fsType:overlay blockSize:0} overlay_0-490:{mountpoint:/var/lib/containers/storage/overlay/57a38e16e3a3374e4886a5d025d18e9711446c44dc0baa2be0c1af3a839fc0a9/merged major:0 minor:490 fsType:overlay blockSize:0} overlay_0-492:{mountpoint:/var/lib/containers/storage/overlay/b92e8aa5f7ebd9a0f4c91276c95c139944d4f2135fbc58292e9a02c8e8e671a8/merged major:0 minor:492 fsType:overlay blockSize:0} 
overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/76c17927e7b49999dcb0a72c340b3b582df347194d5794dd3c71215fded16463/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-510:{mountpoint:/var/lib/containers/storage/overlay/892f6c99486ec1475e74bbf3fe7896250a4afc2e9d95070bdd00dddb0ef61df7/merged major:0 minor:510 fsType:overlay blockSize:0} overlay_0-512:{mountpoint:/var/lib/containers/storage/overlay/265be98822326ee4d8fc8161dcbcb7024a0ed254c20b9a180536b5cd2e4f6509/merged major:0 minor:512 fsType:overlay blockSize:0} overlay_0-514:{mountpoint:/var/lib/containers/storage/overlay/ed09d06719cd820a1173d033a46488a26846e7946cdcc7483d1bd29ed9378765/merged major:0 minor:514 fsType:overlay blockSize:0} overlay_0-516:{mountpoint:/var/lib/containers/storage/overlay/5a8eb19b04f43ac01e7c4dd882d8d6ab9b9b92f4e294e98a14b5a581f299bded/merged major:0 minor:516 fsType:overlay blockSize:0} overlay_0-518:{mountpoint:/var/lib/containers/storage/overlay/5af1b67002c10cef706a60c76d824dd4a9b2eb5a8e46dd1fb150033a203175a1/merged major:0 minor:518 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/3fbe469d46126a5a22a28864d2d45c3e798b77aa9755af854ec32de7ba3755e4/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-522:{mountpoint:/var/lib/containers/storage/overlay/1549e059365defa1c1fa3c452e7e63555e59d64d9f5a50ccf876ddd7219897e7/merged major:0 minor:522 fsType:overlay blockSize:0} overlay_0-524:{mountpoint:/var/lib/containers/storage/overlay/8c28ed49018be2c1a79316bf6de4e7d53bbf89dbe37b4efa1a061801e10ad275/merged major:0 minor:524 fsType:overlay blockSize:0} overlay_0-528:{mountpoint:/var/lib/containers/storage/overlay/850f6ead9aced0feaf572c94bbf808aeca8c469cc798fa499828aa6de1d7f5ce/merged major:0 minor:528 fsType:overlay blockSize:0} overlay_0-536:{mountpoint:/var/lib/containers/storage/overlay/c5477df9efd5d2938869bacac42210b935111289dad6e8a2de99b71937288568/merged major:0 minor:536 fsType:overlay blockSize:0} 
overlay_0-54:{mountpoint:/var/lib/containers/storage/overlay/0a32949807e5ef0fe4632018c7860ca2af7a9ff6efce35f8ae5e0da826f60eed/merged major:0 minor:54 fsType:overlay blockSize:0} overlay_0-543:{mountpoint:/var/lib/containers/storage/overlay/d0da20a4deb41d66781c75bf7fc37708bb9c9ce51c78f17b06ccba94e7e7b035/merged major:0 minor:543 fsType:overlay blockSize:0} overlay_0-547:{mountpoint:/var/lib/containers/storage/overlay/57235fc73cfdd2e040bc66724474721d36c4bd7f70b4e0047bb35657bc490654/merged major:0 minor:547 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/9680acf1d2c820936e60d4b8d67f78ba386a9251916c6a899ebe5c0fe49c38be/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-566:{mountpoint:/var/lib/containers/storage/overlay/8862013f83945eae4727be0176512064c98a5fbeae35cfe71a821da88035e0e0/merged major:0 minor:566 fsType:overlay blockSize:0} overlay_0-571:{mountpoint:/var/lib/containers/storage/overlay/410d3d78be0e1b32fbf76cdbde6df1ff7b8c210ab6e0ef9c70bbea95266839ad/merged major:0 minor:571 fsType:overlay blockSize:0} overlay_0-573:{mountpoint:/var/lib/containers/storage/overlay/2b62ed9254afbcdcb2097ea684031a950d70a43cd42c0f9d4de23b42e467f3d6/merged major:0 minor:573 fsType:overlay blockSize:0} overlay_0-575:{mountpoint:/var/lib/containers/storage/overlay/0026f98ef137a256f8ba53ad277b66eca62912f93fc4c7843e5263331b70856f/merged major:0 minor:575 fsType:overlay blockSize:0} overlay_0-577:{mountpoint:/var/lib/containers/storage/overlay/31c1f06a61e665fdb51626a99e3934252ced7b7fa1b6ba42fe975c9656a27501/merged major:0 minor:577 fsType:overlay blockSize:0} overlay_0-581:{mountpoint:/var/lib/containers/storage/overlay/75ea0792f29bef4dde58190c36de3fae95bd5c59fd04dc17ac6471d3336b3b0a/merged major:0 minor:581 fsType:overlay blockSize:0} overlay_0-583:{mountpoint:/var/lib/containers/storage/overlay/97e97ae160455e5026ea66be75c00aa0f1bc86268ce9e888a7eb248e18c659bf/merged major:0 minor:583 fsType:overlay blockSize:0} 
overlay_0-585:{mountpoint:/var/lib/containers/storage/overlay/9d1e163ea4976895c4b99d9181a5a7239f1ce1617edc010bc1d05ea268f5b37c/merged major:0 minor:585 fsType:overlay blockSize:0} overlay_0-587:{mountpoint:/var/lib/containers/storage/overlay/9985736aafc3a37443e12438af3664ff7b7d5904b957ac179ba205e041ebcae0/merged major:0 minor:587 fsType:overlay blockSize:0} overlay_0-589:{mountpoint:/var/lib/containers/storage/overlay/194b1a05f8cce3b507eb141f67e5c91efbdb728bc5cefdfb822ca281d7265436/merged major:0 minor:589 fsType:overlay blockSize:0} overlay_0-591:{mountpoint:/var/lib/containers/storage/overlay/5ef4a72ae79b7c632bbad7755ae9ec0be0425271263bf716c897bd9374c3552b/merged major:0 minor:591 fsType:overlay blockSize:0} overlay_0-593:{mountpoint:/var/lib/containers/storage/overlay/4a2a8e9fda9aff46d6968ac98773776c6f93158913f71e2985b633898bb14d5c/merged major:0 minor:593 fsType:overlay blockSize:0} overlay_0-597:{mountpoint:/var/lib/containers/storage/overlay/843214a9b3e96e639428dd3ad431e27ac32b75140407bddba4a912174770f680/merged major:0 minor:597 fsType:overlay blockSize:0} overlay_0-599:{mountpoint:/var/lib/containers/storage/overlay/3608a72a2e5b36b5917d22beb951cb3e9703dad4498c21602aeec53fe6bb96a3/merged major:0 minor:599 fsType:overlay blockSize:0} overlay_0-601:{mountpoint:/var/lib/containers/storage/overlay/21b650839e5b9b3655a97ba14e5709da3c56b2a9d63dacdc1d96f650271eb351/merged major:0 minor:601 fsType:overlay blockSize:0} overlay_0-603:{mountpoint:/var/lib/containers/storage/overlay/3084de4a9e2aaa0fc6af36d9f82e8f262a2ea513838d3254e820e42f1405dc78/merged major:0 minor:603 fsType:overlay blockSize:0} overlay_0-605:{mountpoint:/var/lib/containers/storage/overlay/e40e35348a51b9169e6e0dd8fe6a0394ae2bc7bd2e00ce532b5172246ad9fc68/merged major:0 minor:605 fsType:overlay blockSize:0} overlay_0-611:{mountpoint:/var/lib/containers/storage/overlay/5fb946034f529bf2e92f2bf20c2f12ee2835f7b6d37a172923193c3ab19f8512/merged major:0 
minor:611 fsType:overlay blockSize:0} overlay_0-613:{mountpoint:/var/lib/containers/storage/overlay/47c3a6a7e7bc39ae7ff99a8107e54f4dca58f11fbe8e7118c9d2f86c729b3070/merged major:0 minor:613 fsType:overlay blockSize:0} overlay_0-617:{mountpoint:/var/lib/containers/storage/overlay/56a8a24cda76914fac5220919013e2788f45a5bc8b0c5c91c8ed1162fa4bee6a/merged major:0 minor:617 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/4c8ad3a8be25193d2ab53ea0def8c657bf8fd0e3774f7aa2875e80d5241b3549/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-623:{mountpoint:/var/lib/containers/storage/overlay/974382f5af41acf02b8da604a8f87fc2e800ab96770fb2e4130eca8ec1f64cc1/merged major:0 minor:623 fsType:overlay blockSize:0} overlay_0-625:{mountpoint:/var/lib/containers/storage/overlay/db63facb7eb1029911860310301c436d12f88aba59590f10ad155f4081ac836f/merged major:0 minor:625 fsType:overlay blockSize:0} overlay_0-636:{mountpoint:/var/lib/containers/storage/overlay/89514abcfcf193b7dce6702a2d595bf30f5af8eea6a12b5c1a036ccdaf1ff47f/merged major:0 minor:636 fsType:overlay blockSize:0} overlay_0-637:{mountpoint:/var/lib/containers/storage/overlay/de9be1b39b2d34fcfb605074b6e49e3e869924222be9737fd749784361a26e9c/merged major:0 minor:637 fsType:overlay blockSize:0} overlay_0-639:{mountpoint:/var/lib/containers/storage/overlay/c766391ed2468f594620b4e2a5e55239d066f60413fa0ac9ea2d4284db3280db/merged major:0 minor:639 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/55d6b5e139e5c0bb10cbf6b62e5cef3688159da34eda2c0bcc44e8fe38b1c109/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-641:{mountpoint:/var/lib/containers/storage/overlay/0b6f5a6834e3111655913663fda054039b7a2421ad6a60f2a864ba7786c9973e/merged major:0 minor:641 fsType:overlay blockSize:0} overlay_0-644:{mountpoint:/var/lib/containers/storage/overlay/848246efa0e2918e905065132a58a182940718ded7dfcbc3d6e6001a82e0d56b/merged major:0 minor:644 
fsType:overlay blockSize:0} overlay_0-648:{mountpoint:/var/lib/containers/storage/overlay/03716fb8fdae1b2b6a48fc8e0ae7c37c0f6b147625ccf9b7176824f101a30f37/merged major:0 minor:648 fsType:overlay blockSize:0} overlay_0-650:{mountpoint:/var/lib/containers/storage/overlay/a6d1afdc7f8e800033a9d6d46cf4a449be6ca083135deb52b2924aaee8bdfe3c/merged major:0 minor:650 fsType:overlay blockSize:0} overlay_0-656:{mountpoint:/var/lib/containers/storage/overlay/fc9aca074c30aaf7e8fea4014909cfb7fc1041f6a19179b3083fd63e90ce6d14/merged major:0 minor:656 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/48641ff8ee973a851dcd35f2366ff86d2b998ad75ecd1b14219553af22820522/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-67:{mountpoint:/var/lib/containers/storage/overlay/9e20b93187104def1303be203fb142bfbc9d1d6e1b90a5505c6f71feb15436f2/merged major:0 minor:67 fsType:overlay blockSize:0} overlay_0-670:{mountpoint:/var/lib/containers/storage/overlay/14c6dde179d3934d73e79f01a5db61f31b9026205d5059e7fe753fbd670887e5/merged major:0 minor:670 fsType:overlay blockSize:0} overlay_0-672:{mountpoint:/var/lib/containers/storage/overlay/b6070a8b46e81865554ba987dbb47680ad9f70a13cd380c4cf30c536db34c1d9/merged major:0 minor:672 fsType:overlay blockSize:0} overlay_0-679:{mountpoint:/var/lib/containers/storage/overlay/a04b3aa95ee63ca7bbcd304027ef0d8d1c1a898ad2ed7e99f582c7b1724606e9/merged major:0 minor:679 fsType:overlay blockSize:0} overlay_0-685:{mountpoint:/var/lib/containers/storage/overlay/0b7f2727f4ec7607cda929caa1bc28afe7d7804f13de64dd8b19830d09e72e08/merged major:0 minor:685 fsType:overlay blockSize:0} overlay_0-686:{mountpoint:/var/lib/containers/storage/overlay/fb66735d869f6f2aa55baf8bd82746a41a39d4ac4369dca69d0f92dbbc10fa03/merged major:0 minor:686 fsType:overlay blockSize:0} overlay_0-689:{mountpoint:/var/lib/containers/storage/overlay/070b96cc7461c6022ec193d0777597d1608daba4d9a517386bbe73e2d17b0210/merged major:0 minor:689 fsType:overlay 
blockSize:0} overlay_0-691:{mountpoint:/var/lib/containers/storage/overlay/3a2c6f1a2b14b49c41a265abc9096908ec6c56d0407b432a1bef05a10ff2248a/merged major:0 minor:691 fsType:overlay blockSize:0} overlay_0-693:{mountpoint:/var/lib/containers/storage/overlay/08518346822d66db4f96290266b58c7d59172227a0eae5a51ccf5628631f1cfd/merged major:0 minor:693 fsType:overlay blockSize:0} overlay_0-695:{mountpoint:/var/lib/containers/storage/overlay/32a255efe64dc7178e48956d528e6272660486ff95e2f693e11cdf63ef240e77/merged major:0 minor:695 fsType:overlay blockSize:0} overlay_0-701:{mountpoint:/var/lib/containers/storage/overlay/7ca718a19c6165af2c47363690073ad9e0425c285c8455dbf682a2f862281006/merged major:0 minor:701 fsType:overlay blockSize:0} overlay_0-713:{mountpoint:/var/lib/containers/storage/overlay/67617f2db2c4a2730da6c5f00d637abacde36681657103ada10fbf8d89046844/merged major:0 minor:713 fsType:overlay blockSize:0} overlay_0-714:{mountpoint:/var/lib/containers/storage/overlay/de9b3b6d77d7efb3a053f1a959bc12efd80b0794080b9c2a9d68bae49c00a6f4/merged major:0 minor:714 fsType:overlay blockSize:0} overlay_0-721:{mountpoint:/var/lib/containers/storage/overlay/5a99c2b75b2d31988295b6c1aaabd14526fae0134c6b4f7f7163e3d569532c9c/merged major:0 minor:721 fsType:overlay blockSize:0} overlay_0-723:{mountpoint:/var/lib/containers/storage/overlay/d2c18c1afa33db557238b4fa6c7381cf0a3a457b5f7eedc4b519f46be559c6ab/merged major:0 minor:723 fsType:overlay blockSize:0} overlay_0-726:{mountpoint:/var/lib/containers/storage/overlay/70cd02db30c8c974f67135b7d5df13d6b22bad6235b0591b9e1a3274f10e7add/merged major:0 minor:726 fsType:overlay blockSize:0} overlay_0-727:{mountpoint:/var/lib/containers/storage/overlay/095f130f8db2072e9c275179e0c4613554fb4ccc31e3e45eb1ef37f2a711a2c8/merged major:0 minor:727 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/b0b522a7d3676906dfda7c892f71d5f56079f1af21a566fb6220047287f1ac4c/merged major:0 minor:729 fsType:overlay blockSize:0} 
overlay_0-741:{mountpoint:/var/lib/containers/storage/overlay/51cbb6f5b2b037caa0bb0234a1fdcda8bd70971691fe29d8cd71d0e4efb38533/merged major:0 minor:741 fsType:overlay blockSize:0} overlay_0-75:{mountpoint:/var/lib/containers/storage/overlay/1f8e4f130afc61504c2f205d854b9d33a63d998a0959dec023ea4ecef11dae00/merged major:0 minor:75 fsType:overlay blockSize:0} overlay_0-752:{mountpoint:/var/lib/containers/storage/overlay/e3d5b08fd8db85798929c37c7101898645d0961ffd48fee946bcfc7e893468b6/merged major:0 minor:752 fsType:overlay blockSize:0} overlay_0-756:{mountpoint:/var/lib/containers/storage/overlay/754b4605a35ff413b04fbb62934173140eefedcc1afa84a7a76d1a0a5165204b/merged major:0 minor:756 fsType:overlay blockSize:0} overlay_0-762:{mountpoint:/var/lib/containers/storage/overlay/f60721e93186eb2c06f543a2f33afb18d9df1a4cd48ac32d0a9f5945bec1e447/merged major:0 minor:762 fsType:overlay blockSize:0} overlay_0-772:{mountpoint:/var/lib/containers/storage/overlay/d5d55a63593cc664351469d31f857b6baba775782ed2cf74193135f548ea7096/merged major:0 minor:772 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/8238b110d057a4e912a7449a20c7fd53c5dfaee35aa848d82539ad5274702ff1/merged major:0 minor:774 fsType:overlay blockSize:0} overlay_0-788:{mountpoint:/var/lib/containers/storage/overlay/580433a24ec878157b4cb7a8b28a3e3779305761478ec8dd096a8ff0bac924cb/merged major:0 minor:788 fsType:overlay blockSize:0} overlay_0-790:{mountpoint:/var/lib/containers/storage/overlay/b4159f5404848ef7aa3af6efd0ab67a341de9e98a58456a874509571fde21abe/merged major:0 minor:790 fsType:overlay blockSize:0} overlay_0-80:{mountpoint:/var/lib/containers/storage/overlay/82ab4451e35e9a68cf12b0b304e32274205edd9ccdfec8c0f6378e58164fbbe3/merged major:0 minor:80 fsType:overlay blockSize:0} overlay_0-811:{mountpoint:/var/lib/containers/storage/overlay/654b93aa9a5c4a7c4ff1043f071bf7b4770d23ca86a5fd20b6428299c8cadbe8/merged major:0 minor:811 fsType:overlay blockSize:0} 
overlay_0-813:{mountpoint:/var/lib/containers/storage/overlay/26ac1f926c7447f34f2f9a8ea23ede8045645eaa8a5aae459d960784f4a83aa2/merged major:0 minor:813 fsType:overlay blockSize:0} overlay_0-815:{mountpoint:/var/lib/containers/storage/overlay/ce08ea4768ff2da27b464b8f91f07f6eeeb735564851bac5e779ed670fd61827/merged major:0 minor:815 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/6018dfd745a6a64eeaf2bd54bca433091a56336b62060ae0a342e938cfa988d3/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-841:{mountpoint:/var/lib/containers/storage/overlay/0559d77ae6a6363ee9c215b054ebcdee6c987ac844be29eeda8707fab802c3ae/merged major:0 minor:841 fsType:overlay blockSize:0} overlay_0-843:{mountpoint:/var/lib/containers/storage/overlay/86cd3b6c4fe376e9d803f1b61977d64c485164f21ec919dc002961ffc5b4883a/merged major:0 minor:843 fsType:overlay blockSize:0} overlay_0-845:{mountpoint:/var/lib/containers/storage/overlay/3f7e6d9342e6ed7fff6f5c1e7193d7e24cd85c3db70c16645041d95fd153c63c/merged major:0 minor:845 fsType:overlay blockSize:0} overlay_0-848:{mountpoint:/var/lib/containers/storage/overlay/dd1c836f555b7dee7b1c436c803ea27e504b7054a44677383054574b3e496dd2/merged major:0 minor:848 fsType:overlay blockSize:0} overlay_0-854:{mountpoint:/var/lib/containers/storage/overlay/96d0eb381b1604fffabeb029fe66d9a10e6f813883a3decf724cf8652c896302/merged major:0 minor:854 fsType:overlay blockSize:0} overlay_0-865:{mountpoint:/var/lib/containers/storage/overlay/737a272ecfe8ecad9151361e8b91177779054259d5e80b065615652dfcd6e912/merged major:0 minor:865 fsType:overlay blockSize:0} overlay_0-868:{mountpoint:/var/lib/containers/storage/overlay/7a7e48bd3083cb42b10cd9bb239367c077df31d0abf05fe46079df2bae0fdbfa/merged major:0 minor:868 fsType:overlay blockSize:0} overlay_0-887:{mountpoint:/var/lib/containers/storage/overlay/79d79776fbb921cdc29146e7892344d3f2f7384cc12b2da4882417178b328dfc/merged major:0 minor:887 fsType:overlay blockSize:0} 
overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/3b23fba8ba846e9dbcb04f6975574166434f7ee49983c545e11e09dcbd679524/merged major:0 minor:89 fsType:overlay blockSize:0} overlay_0-892:{mountpoint:/var/lib/containers/storage/overlay/9bbe9e18f35911886122b1cc88cdfe30aee41614f68cc8dba78717ef780a2d06/merged major:0 minor:892 fsType:overlay blockSize:0} overlay_0-894:{mountpoint:/var/lib/containers/storage/overlay/ed8394f155b51320187c961b32da9fa7b71d76f668fac768e89c56e068d4f5e6/merged major:0 minor:894 fsType:overlay blockSize:0} overlay_0-896:{mountpoint:/var/lib/containers/storage/overlay/dc1aadac166ddc51a069a4538b7ebe44144fab847306df3e0e2149404f3a607f/merged major:0 minor:896 fsType:overlay blockSize:0} overlay_0-898:{mountpoint:/var/lib/containers/storage/overlay/156dc504f061ee311018f0250897868928695496ced2d429871c78c07ce5b95e/merged major:0 minor:898 fsType:overlay blockSize:0} overlay_0-900:{mountpoint:/var/lib/containers/storage/overlay/0cabf72328e31e5cf7f508ef1f58f4f89d1635657a02f8927804c8266e532d56/merged major:0 minor:900 fsType:overlay blockSize:0} overlay_0-901:{mountpoint:/var/lib/containers/storage/overlay/c54b49e03bc1f306e1cc753d295c89bb73b8c54707c17dcf21ee49a4ff0a6ea0/merged major:0 minor:901 fsType:overlay blockSize:0} overlay_0-905:{mountpoint:/var/lib/containers/storage/overlay/c882ae36036b131b724f40f1cb66373bd340b4bd8e96d13639a3dfb6b916fb65/merged major:0 minor:905 fsType:overlay blockSize:0} overlay_0-907:{mountpoint:/var/lib/containers/storage/overlay/a4c1252a744068e8afb9c1883f1b9c0c6e1805fc4bf4c195c949c4dce53070f8/merged major:0 minor:907 fsType:overlay blockSize:0} overlay_0-909:{mountpoint:/var/lib/containers/storage/overlay/91522f81f23705b6ea932820d6148448560ff9208e67b1b676069e9413ee3c00/merged major:0 minor:909 fsType:overlay blockSize:0} overlay_0-91:{mountpoint:/var/lib/containers/storage/overlay/849f7c25e91c53a4c14889eac4d21a24128a71a4b1e6e5db9f3f0e37b3fa8a42/merged major:0 minor:91 fsType:overlay blockSize:0} 
overlay_0-911:{mountpoint:/var/lib/containers/storage/overlay/2a9ba8cc8c39b80ca71d2b1d5b3f5e0dddf0fe75bd947d2c362add497524b4c6/merged major:0 minor:911 fsType:overlay blockSize:0} overlay_0-913:{mountpoint:/var/lib/containers/storage/overlay/8b3d1e09d74b9c086aec1b0dddce6a8272ff03f13174bbbea9592f54f26a1c47/merged major:0 minor:913 fsType:overlay blockSize:0} overlay_0-915:{mountpoint:/var/lib/containers/storage/overlay/d7dd24492e65d1274cc06f16c05ef54a45d20c8a9399a08448ad3f595eac413a/merged major:0 minor:915 fsType:overlay blockSize:0} overlay_0-917:{mountpoint:/var/lib/containers/storage/overlay/761bafc38843ca3ba16333ac42cdbf7f89ca1d26f01cda05f1c8881d3b49e327/merged major:0 minor:917 fsType:overlay blockSize:0} overlay_0-923:{mountpoint:/var/lib/containers/storage/overlay/de1b03f485889ef4eba34515a77d27ce141ee958bd4674a82dc102bf80f66275/merged major:0 minor:923 fsType:overlay blockSize:0} overlay_0-928:{mountpoint:/var/lib/containers/storage/overlay/675021e25873031590694a018397b432ab3940a7e9a2ee2fbf32512142a745ea/merged major:0 minor:928 fsType:overlay blockSize:0} overlay_0-929:{mountpoint:/var/lib/containers/storage/overlay/d5122e9fdb135ef29518dfb74c1c3f1f71a1fd158ab749e3d1a78cd5a0249573/merged major:0 minor:929 fsType:overlay blockSize:0} overlay_0-931:{mountpoint:/var/lib/containers/storage/overlay/5e75fb21f509af848992c394636e5680d0b3e0791321d9ada810ae3f1929e936/merged major:0 minor:931 fsType:overlay blockSize:0} overlay_0-938:{mountpoint:/var/lib/containers/storage/overlay/44171e04dd54f0f8ac72cc204aa6133d8691e52453a896060cd9c97d1d475e29/merged major:0 minor:938 fsType:overlay blockSize:0} overlay_0-939:{mountpoint:/var/lib/containers/storage/overlay/c30c714a81b5443a907fcb34041a7ffcc81cf3805a425acff24c41103056ba0c/merged major:0 minor:939 fsType:overlay blockSize:0} overlay_0-949:{mountpoint:/var/lib/containers/storage/overlay/e53aaf9c75b1d60ae954b55a99fc8d653c669ec9ef168ce0a3672546bdfff728/merged major:0 minor:949 fsType:overlay blockSize:0} 
overlay_0-95:{mountpoint:/var/lib/containers/storage/overlay/3e8871120c344d9cd2314e5836a41f513c7febb7077d61ac62a6e0f843902f60/merged major:0 minor:95 fsType:overlay blockSize:0} overlay_0-951:{mountpoint:/var/lib/containers/storage/overlay/d26cde5658779c76e8109f5df4ef86703311768dfed280e2177fac76bad162eb/merged major:0 minor:951 fsType:overlay blockSize:0} overlay_0-955:{mountpoint:/var/lib/containers/storage/overlay/ab48edd49665da78c9af9a645bc9f4ffb370618b0115ba43ab8d70e97b71ea1c/merged major:0 minor:955 fsType:overlay blockSize:0} overlay_0-958:{mountpoint:/var/lib/containers/storage/overlay/480b95fba5c665ce82cb19a47bf4aaf3e29600418676ea16ea864a03069581a1/merged major:0 minor:958 fsType:overlay blockSize:0} overlay_0-969:{mountpoint:/var/lib/containers/storage/overlay/afecaefb3f6cc442b5393ac1eef6f8d8ae71e979dfcbbf1da0fe9077e60311dc/merged major:0 minor:969 fsType:overlay blockSize:0} overlay_0-97:{mountpoint:/var/lib/containers/storage/overlay/6660605dc0b69066fd20fcbd97167e8fc6a24374cf1ec175e49d091d03b66316/merged major:0 minor:97 fsType:overlay blockSize:0} overlay_0-971:{mountpoint:/var/lib/containers/storage/overlay/ce868b7193a6c140b71c42fcb1242514acd441bb55e8d216dc07906234f518fc/merged major:0 minor:971 fsType:overlay blockSize:0} overlay_0-972:{mountpoint:/var/lib/containers/storage/overlay/23cc92083b7c2b2e5394d77de195d4a3fa555be58ef1cd7f119f263a719e68e1/merged major:0 minor:972 fsType:overlay blockSize:0} overlay_0-977:{mountpoint:/var/lib/containers/storage/overlay/c1c8ccd48904e51106550b9d084125adaf64ce14d36c466f88504979daf9b7f8/merged major:0 minor:977 fsType:overlay blockSize:0} overlay_0-979:{mountpoint:/var/lib/containers/storage/overlay/d3fda71e208e297f41b0e60b3e3673b7259d423fdb0101c06e4d8c6118b6e1c2/merged major:0 minor:979 fsType:overlay blockSize:0} overlay_0-980:{mountpoint:/var/lib/containers/storage/overlay/e6cc07c59f7a58d7d631dfe40211847db7a48752909bfe6c2c8f7aa0ab1ed2d0/merged major:0 minor:980 fsType:overlay blockSize:0} 
overlay_0-982:{mountpoint:/var/lib/containers/storage/overlay/b34de35020744efafaa931a10fe241d28ffe2bc16d29e5f68157892981453fd4/merged major:0 minor:982 fsType:overlay blockSize:0} overlay_0-990:{mountpoint:/var/lib/containers/storage/overlay/75d8bc45bdf0a05daeef27746820be6b586cc7b8f0dbac552c063c0499b494b9/merged major:0 minor:990 fsType:overlay blockSize:0} overlay_0-991:{mountpoint:/var/lib/containers/storage/overlay/c73becc3d725fc4fe25eddabc17645b0a0c3a794927442bd0a4b5cd7d4324fd9/merged major:0 minor:991 fsType:overlay blockSize:0}] Feb 23 14:34:28.032268 master-0 kubenswrapper[28758]: I0223 14:34:28.030669 28758 manager.go:217] Machine: {Timestamp:2026-02-23 14:34:28.027998077 +0000 UTC m=+0.154314029 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2800000 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:b2003aa684e6437e87dd9193d3a162ac SystemUUID:b2003aa6-84e6-437e-87dd-9193d3a162ac BootID:f84e7e92-cd63-4a9e-83cc-11dcb3ddc406 Filesystems:[{Device:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~projected/kube-api-access-4nr85 DeviceMajor:0 DeviceMinor:278 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-939 DeviceMajor:0 DeviceMinor:939 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a/userdata/shm DeviceMajor:0 DeviceMinor:507 Capacity:67108864 
Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1111 DeviceMajor:0 DeviceMinor:1111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-148 DeviceMajor:0 DeviceMinor:148 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-192 DeviceMajor:0 DeviceMinor:192 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-689 DeviceMajor:0 DeviceMinor:689 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274/userdata/shm DeviceMajor:0 DeviceMinor:144 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-685 DeviceMajor:0 DeviceMinor:685 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39/userdata/shm DeviceMajor:0 DeviceMinor:415 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~projected/kube-api-access-2cd7w DeviceMajor:0 DeviceMinor:552 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54/userdata/shm DeviceMajor:0 DeviceMinor:420 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-170 DeviceMajor:0 DeviceMinor:170 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-601 DeviceMajor:0 DeviceMinor:601 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-815 DeviceMajor:0 DeviceMinor:815 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-581 DeviceMajor:0 DeviceMinor:581 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1106 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/edfafba30f67b299b61cf6429b5bf8b47c050040f18802ede0a7f2834a957ae9/userdata/shm DeviceMajor:0 DeviceMinor:551 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dac97420bb9e3883db6238fff45b69e331260668444b09a65159744c334f79d2/userdata/shm DeviceMajor:0 DeviceMinor:759 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-958 DeviceMajor:0 DeviceMinor:958 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8dd5fa7c-0519-4170-89c6-b369e5fc1990/volumes/kubernetes.io~projected/kube-api-access-chs7z DeviceMajor:0 DeviceMinor:1325 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1148 DeviceMajor:0 DeviceMinor:1148 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-176 DeviceMajor:0 DeviceMinor:176 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-727 DeviceMajor:0 DeviceMinor:727 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/volumes/kubernetes.io~projected/kube-api-access-qtbcj DeviceMajor:0 DeviceMinor:806 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5/userdata/shm DeviceMajor:0 DeviceMinor:520 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a4ae9292-71dc-4484-b277-43cb26c1e04d/volumes/kubernetes.io~projected/kube-api-access-8llc8 DeviceMajor:0 DeviceMinor:250 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-490 DeviceMajor:0 
DeviceMinor:490 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-512 DeviceMajor:0 DeviceMinor:512 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/38ec00e9dfbef6fee1166da5b097f1e6a12696d48f32303b497f0b2d760141c8/userdata/shm DeviceMajor:0 DeviceMinor:1225 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-319 DeviceMajor:0 DeviceMinor:319 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40/userdata/shm DeviceMajor:0 DeviceMinor:488 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-350 DeviceMajor:0 DeviceMinor:350 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/87f989cd-6c19-4a30-833a-10e98b7a0326/volumes/kubernetes.io~projected/kube-api-access-wpqzn DeviceMajor:0 DeviceMinor:1292 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/92c63c95-e880-4f51-9858-7715343f7bd8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:822 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-460 DeviceMajor:0 DeviceMinor:460 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1113 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-440 DeviceMajor:0 DeviceMinor:440 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-543 DeviceMajor:0 DeviceMinor:543 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4/volumes/kubernetes.io~projected/kube-api-access-rr7rw DeviceMajor:0 DeviceMinor:792 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1296 DeviceMajor:0 DeviceMinor:1296 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-790 DeviceMajor:0 DeviceMinor:790 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-686 DeviceMajor:0 DeviceMinor:686 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1165 DeviceMajor:0 DeviceMinor:1165 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/121fb1d62a402b22b2ce0dcefcc58af76b44ad548fdacc6da5113c93b5d1d4e0/userdata/shm DeviceMajor:0 DeviceMinor:359 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1298 DeviceMajor:0 DeviceMinor:1298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-182 DeviceMajor:0 DeviceMinor:182 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a/userdata/shm DeviceMajor:0 DeviceMinor:285 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1167 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 
DeviceMinor:473 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-346 DeviceMajor:0 DeviceMinor:346 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-907 DeviceMajor:0 DeviceMinor:907 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e/userdata/shm DeviceMajor:0 DeviceMinor:677 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-637 DeviceMajor:0 DeviceMinor:637 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c67a2ed2-f520-46fc-84d3-6816dc19f4e0/volumes/kubernetes.io~projected/kube-api-access-hj8ff DeviceMajor:0 DeviceMinor:765 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-611 DeviceMajor:0 DeviceMinor:611 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1258 DeviceMajor:0 DeviceMinor:1258 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8/userdata/shm DeviceMajor:0 DeviceMinor:1293 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-388 DeviceMajor:0 DeviceMinor:388 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-433 DeviceMajor:0 DeviceMinor:433 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-854 DeviceMajor:0 DeviceMinor:854 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-679 
DeviceMajor:0 DeviceMinor:679 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1208 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-432 DeviceMajor:0 DeviceMinor:432 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-597 DeviceMajor:0 DeviceMinor:597 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-972 DeviceMajor:0 DeviceMinor:972 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-178 DeviceMajor:0 DeviceMinor:178 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-220 DeviceMajor:0 DeviceMinor:220 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-516 DeviceMajor:0 DeviceMinor:516 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:414 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1008 DeviceMajor:0 DeviceMinor:1008 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f86e881-275c-4387-a23a-06c559c8f1e8/volumes/kubernetes.io~projected/kube-api-access-wp8kk DeviceMajor:0 DeviceMinor:861 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-979 DeviceMajor:0 DeviceMinor:979 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-1277 DeviceMajor:0 DeviceMinor:1277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~projected/kube-api-access-qggzs DeviceMajor:0 DeviceMinor:259 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/601970e99fee05d1ddde3baeb681b21e539729838cc176b833fde61b155a74a5/userdata/shm DeviceMajor:0 DeviceMinor:550 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-911 DeviceMajor:0 DeviceMinor:911 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-593 DeviceMajor:0 DeviceMinor:593 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1062 DeviceMajor:0 DeviceMinor:1062 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5cc28e06-3542-4a25-a8b1-5f5b4ee41114/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1015 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8dd5fa7c-0519-4170-89c6-b369e5fc1990/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:1320 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/674041a2-e2b0-4286-88cc-f1b00571e3f3/volumes/kubernetes.io~projected/kube-api-access-brd4j DeviceMajor:0 DeviceMinor:73 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~projected/kube-api-access-tr2p2 DeviceMajor:0 DeviceMinor:239 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~projected/kube-api-access-9sflb DeviceMajor:0 DeviceMinor:1130 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1219 DeviceMajor:0 DeviceMinor:1219 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1237 DeviceMajor:0 DeviceMinor:1237 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~projected/kube-api-access-chznd DeviceMajor:0 DeviceMinor:249 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:819 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/223e6055ac3ccbf4f5095a5d877f5de6b8592fbf41a24e3985f9a14f56619a70/userdata/shm DeviceMajor:0 DeviceMinor:876 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~projected/kube-api-access-kp5kb DeviceMajor:0 DeviceMinor:167 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-762 DeviceMajor:0 DeviceMinor:762 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-701 DeviceMajor:0 DeviceMinor:701 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~projected/kube-api-access-knkx2 DeviceMajor:0 DeviceMinor:436 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:818 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/709ac071-4392-4a3f-a3d1-4bc8ba2f6236/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:348 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8/userdata/shm DeviceMajor:0 DeviceMinor:533 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1144 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:254 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc/userdata/shm DeviceMajor:0 DeviceMinor:279 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-913 DeviceMajor:0 DeviceMinor:913 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610/userdata/shm DeviceMajor:0 DeviceMinor:131 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:817 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/97f199fa26d5c3158a89f49cff2f70c3039a2be9fc4ad7fe0571f7a519be854c/userdata/shm DeviceMajor:0 DeviceMinor:362 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:474 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/af950a67-1557-4352-8100-27281bb8ecbe/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:754 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-54 DeviceMajor:0 DeviceMinor:54 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/865ceedb-b19a-4f2f-b295-311e1b7a645e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:235 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042/userdata/shm DeviceMajor:0 DeviceMinor:301 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-117 DeviceMajor:0 DeviceMinor:117 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-448 DeviceMajor:0 DeviceMinor:448 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1235 DeviceMajor:0 DeviceMinor:1235 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1239 DeviceMajor:0 DeviceMinor:1239 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:540 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-522 DeviceMajor:0 DeviceMinor:522 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-394 DeviceMajor:0 DeviceMinor:394 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-470 DeviceMajor:0 DeviceMinor:470 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe/userdata/shm DeviceMajor:0 DeviceMinor:77 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-180 DeviceMajor:0 DeviceMinor:180 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~projected/kube-api-access-d5f8j DeviceMajor:0 DeviceMinor:253 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:435 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:862 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-95 DeviceMajor:0 DeviceMinor:95 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9/userdata/shm DeviceMajor:0 DeviceMinor:874 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a/userdata/shm DeviceMajor:0 DeviceMinor:384 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/57b57915-64dd-42f5-b06f-bc4bcc06b667/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:413 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:646 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-617 DeviceMajor:0 DeviceMinor:617 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-865 DeviceMajor:0 DeviceMinor:865 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-381 DeviceMajor:0 DeviceMinor:381 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-788 DeviceMajor:0 DeviceMinor:788 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-980 DeviceMajor:0 DeviceMinor:980 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/842c59a633c6726baab1699104248bceff992214333b768aa99b1550ee1de3d0/userdata/shm DeviceMajor:0 DeviceMinor:878 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-714 DeviceMajor:0 DeviceMinor:714 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1273 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-80 DeviceMajor:0 DeviceMinor:80 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:541 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1117 DeviceMajor:0 DeviceMinor:1117 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/4373687a-61a0-434b-81f7-3fecaa1494ef/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:748 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:494 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-772 DeviceMajor:0 DeviceMinor:772 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1216 DeviceMajor:0 DeviceMinor:1216 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-200 DeviceMajor:0 DeviceMinor:200 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~projected/kube-api-access-q9tkx DeviceMajor:0 DeviceMinor:267 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-571 DeviceMajor:0 DeviceMinor:571 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:542 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a52d8e1940b8a601f24fbfede361672aeb32a5195856b935f21043b52b85ae5/userdata/shm DeviceMajor:0 DeviceMinor:807 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-75 DeviceMajor:0 DeviceMinor:75 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/09d80e28-0b64-4c5d-a9bc-99d843d40165/volumes/kubernetes.io~projected/kube-api-access-g9z2f DeviceMajor:0 DeviceMinor:113 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ace75aae-6f4f-4299-90e2-d5292271b136/volumes/kubernetes.io~projected/kube-api-access-wzkcs 
DeviceMajor:0 DeviceMinor:135 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-813 DeviceMajor:0 DeviceMinor:813 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e5104cdd-85b8-49ba-95ca-3e9c8218a01e/volumes/kubernetes.io~projected/kube-api-access-8qfr7 DeviceMajor:0 DeviceMinor:1141 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~projected/kube-api-access-vp6tj DeviceMajor:0 DeviceMinor:257 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/08c561b3-613b-425f-9de4-d5fc8762ea51/volumes/kubernetes.io~projected/kube-api-access-phmkf DeviceMajor:0 DeviceMinor:277 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/85365dec-af50-406c-b258-890e4f454c4a/volumes/kubernetes.io~projected/kube-api-access-k5d9w DeviceMajor:0 DeviceMinor:785 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-977 DeviceMajor:0 DeviceMinor:977 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1119 DeviceMajor:0 DeviceMinor:1119 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1142 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1086 DeviceMajor:0 DeviceMinor:1086 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1218 DeviceMajor:0 DeviceMinor:1218 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f/userdata/shm DeviceMajor:0 
DeviceMinor:264 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/172d47fd-e1a1-4d77-9e31-c4f22e824d5f/volumes/kubernetes.io~projected/kube-api-access-9x6q2 DeviceMajor:0 DeviceMinor:466 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-330 DeviceMajor:0 DeviceMinor:330 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ad0f0d72-0337-4347-bb50-e299a175f3ca/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:505 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-931 DeviceMajor:0 DeviceMinor:931 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-695 DeviceMajor:0 DeviceMinor:695 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98/userdata/shm DeviceMajor:0 DeviceMinor:295 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:412 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-479 DeviceMajor:0 DeviceMinor:479 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-172 DeviceMajor:0 DeviceMinor:172 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b54fc16-d2f7-4b10-a611-5b411b389c5a/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:538 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-583 DeviceMajor:0 DeviceMinor:583 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-843 DeviceMajor:0 DeviceMinor:843 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb/userdata/shm DeviceMajor:0 DeviceMinor:82 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~projected/kube-api-access-dzcqx DeviceMajor:0 DeviceMinor:1210 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1279 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871/userdata/shm DeviceMajor:0 DeviceMinor:417 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fee0bde3d0eee2f0bc5f9cbe5f3f907b178692716f3f6aef77b4bea08c864506/userdata/shm DeviceMajor:0 DeviceMinor:418 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fbb66172-1ea9-4683-b88f-227c4fd94924/volumes/kubernetes.io~projected/kube-api-access-kl87q DeviceMajor:0 DeviceMinor:831 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f/userdata/shm DeviceMajor:0 DeviceMinor:124 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242/userdata/shm DeviceMajor:0 DeviceMinor:750 Capacity:67108864 Type:vfs Inodes:6166278 
HasInodes:true} {Device:overlay_0-464 DeviceMajor:0 DeviceMinor:464 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-577 DeviceMajor:0 DeviceMinor:577 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1300 DeviceMajor:0 DeviceMinor:1300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-205 DeviceMajor:0 DeviceMinor:205 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-848 DeviceMajor:0 DeviceMinor:848 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fbb66172-1ea9-4683-b88f-227c4fd94924/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:830 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1067 DeviceMajor:0 DeviceMinor:1067 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-386 DeviceMajor:0 DeviceMinor:386 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7/userdata/shm DeviceMajor:0 DeviceMinor:430 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:554 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-599 DeviceMajor:0 DeviceMinor:599 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7c2bb3b30fb024eb641e1765113a1a36bccda4c627461f72aa312212e851ddb2/userdata/shm DeviceMajor:0 DeviceMinor:839 Capacity:67108864 Type:vfs 
Inodes:6166278 HasInodes:true} {Device:overlay_0-726 DeviceMajor:0 DeviceMinor:726 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1275 DeviceMajor:0 DeviceMinor:1275 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:243 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a74df791e2285ece031ddb2cb6a548b32c5f641cf114501941f1933c7809fad4/userdata/shm DeviceMajor:0 DeviceMinor:595 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640/userdata/shm DeviceMajor:0 DeviceMinor:809 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5c45c6ef7e4b05f37927f974cf6cb4b129b6dd3a04cd82a3e97ac1e29ceb5010/userdata/shm DeviceMajor:0 DeviceMinor:835 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-845 DeviceMajor:0 DeviceMinor:845 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-371 DeviceMajor:0 DeviceMinor:371 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf/userdata/shm DeviceMajor:0 DeviceMinor:93 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-210 DeviceMajor:0 DeviceMinor:210 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-363 DeviceMajor:0 DeviceMinor:363 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-636 DeviceMajor:0 DeviceMinor:636 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~projected/kube-api-access-x7jvd DeviceMajor:0 DeviceMinor:260 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1040 DeviceMajor:0 DeviceMinor:1040 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-536 DeviceMajor:0 DeviceMinor:536 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/959c2393-e914-4c10-a18f-b30fcf012d19/volumes/kubernetes.io~projected/kube-api-access-42sml DeviceMajor:0 DeviceMinor:856 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-938 DeviceMajor:0 DeviceMinor:938 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~projected/kube-api-access-269v7 DeviceMajor:0 DeviceMinor:143 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ded555da-db03-498e-81a9-ad166f29a2aa/volumes/kubernetes.io~projected/kube-api-access-x4lz2 DeviceMajor:0 DeviceMinor:325 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1270 DeviceMajor:0 DeviceMinor:1270 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-955 DeviceMajor:0 DeviceMinor:955 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa/userdata/shm DeviceMajor:0 DeviceMinor:1180 Capacity:67108864 Type:vfs Inodes:6166278 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f/userdata/shm DeviceMajor:0 DeviceMinor:92 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-723 DeviceMajor:0 DeviceMinor:723 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-128 DeviceMajor:0 DeviceMinor:128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-656 DeviceMajor:0 DeviceMinor:656 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/709ac071-4392-4a3f-a3d1-4bc8ba2f6236/volumes/kubernetes.io~projected/kube-api-access-6qhr9 DeviceMajor:0 DeviceMinor:383 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/2f876e5d-2e82-47d0-8a9c-adacf2bddf77/volumes/kubernetes.io~projected/kube-api-access-pwtcj DeviceMajor:0 DeviceMinor:487 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9d7c0c01b5302e99d057b82e04bc20f6aaa2ecd35612cea6195a17dbb1d878e/userdata/shm DeviceMajor:0 DeviceMinor:797 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-990 DeviceMajor:0 DeviceMinor:990 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-613 DeviceMajor:0 DeviceMinor:613 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/efdde2df-cd07-4898-88f4-7ecde0e04d7a/volumes/kubernetes.io~projected/kube-api-access-tpc4t DeviceMajor:0 DeviceMinor:784 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-923 DeviceMajor:0 DeviceMinor:923 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~projected/kube-api-access-fjs6f DeviceMajor:0 DeviceMinor:141 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b9774f8c-0f29-46d8-be77-81bcf74d5994/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:579 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/482284fd-6911-4ba6-8d57-7966cc51117a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:866 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e75c63f7eeabff918836bbbeb9c11ed440bed473107c3e0b28076ddfdf91aadb/userdata/shm DeviceMajor:0 DeviceMinor:1146 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9/userdata/shm DeviceMajor:0 DeviceMinor:786 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-841 DeviceMajor:0 DeviceMinor:841 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-887 DeviceMajor:0 DeviceMinor:887 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1186 DeviceMajor:0 DeviceMinor:1186 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/959c2393-e914-4c10-a18f-b30fcf012d19/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:734 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c02c8912-46c9-4f86-ad28-9bfb2eca4e54/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1136 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b6a377cd77390650f0232d49deb3c30f98f3070caae925837c6c5aeeec6246e5/userdata/shm DeviceMajor:0 DeviceMinor:1137 Capacity:67108864 
Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1285 DeviceMajor:0 DeviceMinor:1285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6a801da1-a7eb-4187-98b8-315076f55e19/volumes/kubernetes.io~projected/kube-api-access-pqkz4 DeviceMajor:0 DeviceMinor:483 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~projected/kube-api-access-jtp7w DeviceMajor:0 DeviceMinor:864 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-929 DeviceMajor:0 DeviceMinor:929 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-221 DeviceMajor:0 DeviceMinor:221 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-398 DeviceMajor:0 DeviceMinor:398 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-639 DeviceMajor:0 DeviceMinor:639 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/172d47fd-e1a1-4d77-9e31-c4f22e824d5f/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:373 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-547 DeviceMajor:0 DeviceMinor:547 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6a801da1-a7eb-4187-98b8-315076f55e19/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:532 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1143 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1152 DeviceMajor:0 DeviceMinor:1152 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-605 DeviceMajor:0 DeviceMinor:605 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1047 DeviceMajor:0 DeviceMinor:1047 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1184 DeviceMajor:0 DeviceMinor:1184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-928 DeviceMajor:0 DeviceMinor:928 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-589 DeviceMajor:0 DeviceMinor:589 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-971 DeviceMajor:0 DeviceMinor:971 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-360 DeviceMajor:0 DeviceMinor:360 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1133 DeviceMajor:0 DeviceMinor:1133 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1241 DeviceMajor:0 DeviceMinor:1241 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bbe678de-546d-49d0-8280-3f6d94fa5e4f/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:166 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-230 DeviceMajor:0 DeviceMinor:230 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60/userdata/shm DeviceMajor:0 DeviceMinor:299 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/87f989cd-6c19-4a30-833a-10e98b7a0326/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:1287 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-510 DeviceMajor:0 DeviceMinor:510 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1230 DeviceMajor:0 DeviceMinor:1230 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-741 DeviceMajor:0 DeviceMinor:741 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-315 DeviceMajor:0 DeviceMinor:315 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-894 DeviceMajor:0 DeviceMinor:894 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-691 DeviceMajor:0 DeviceMinor:691 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~projected/kube-api-access-5nn4m DeviceMajor:0 DeviceMinor:828 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-91 DeviceMajor:0 DeviceMinor:91 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1076 DeviceMajor:0 DeviceMinor:1076 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1243 DeviceMajor:0 DeviceMinor:1243 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1332 DeviceMajor:0 DeviceMinor:1332 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-404 DeviceMajor:0 DeviceMinor:404 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1322 DeviceMajor:0 DeviceMinor:1322 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:244 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:459 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~projected/kube-api-access-xlqzc DeviceMajor:0 DeviceMinor:557 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436/userdata/shm DeviceMajor:0 DeviceMinor:1281 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:804 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/06bde94a-3126-4d0f-baba-49dc5fbec61b/volumes/kubernetes.io~projected/kube-api-access-2p6hn DeviceMajor:0 DeviceMinor:1145 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:1278 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-915 DeviceMajor:0 DeviceMinor:915 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1348 DeviceMajor:0 DeviceMinor:1348 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-468 DeviceMajor:0 DeviceMinor:468 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-406 DeviceMajor:0 DeviceMinor:406 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:803 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~projected/kube-api-access-44599 DeviceMajor:0 DeviceMinor:829 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ceba7b56-f910-473d-aed5-add94868fb31/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:711 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:645 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~projected/kube-api-access-hzrqz DeviceMajor:0 DeviceMinor:647 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-457 DeviceMajor:0 DeviceMinor:457 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-901 DeviceMajor:0 DeviceMinor:901 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-104 DeviceMajor:0 DeviceMinor:104 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8de1f285-47ac-42aa-8026-8addce656362/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:247 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-317 DeviceMajor:0 DeviceMinor:317 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-587 DeviceMajor:0 DeviceMinor:587 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a899f46b5ae367f29ecac877a3d8b6b2ea9e0cf04f3dc088df5a7ab7fffcc36/userdata/shm DeviceMajor:0 DeviceMinor:1328 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-164 DeviceMajor:0 DeviceMinor:164 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/af950a67-1557-4352-8100-27281bb8ecbe/volumes/kubernetes.io~projected/kube-api-access-jxrvf DeviceMajor:0 DeviceMinor:755 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/76c67569-3a72-4de9-87cd-432a4607b15b/volumes/kubernetes.io~projected/kube-api-access-2jlzj DeviceMajor:0 DeviceMinor:872 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3debae3963d7ed5cbda602b7ac89dc5c4d861ae9dad9b89fb0b3fcce27f1aad1/userdata/shm DeviceMajor:0 DeviceMinor:1131 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~projected/kube-api-access-x8v9z DeviceMajor:0 DeviceMinor:1209 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/607c1101-3533-43e3-9eda-13cea2b9dbb6/volumes/kubernetes.io~projected/kube-api-access-v4sbp DeviceMajor:0 DeviceMinor:284 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-868 DeviceMajor:0 DeviceMinor:868 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-898 DeviceMajor:0 DeviceMinor:898 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b090ed5a-984f-41dd-8cea-34a1ece1514f/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:140 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-67 DeviceMajor:0 DeviceMinor:67 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1182 DeviceMajor:0 DeviceMinor:1182 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-575 DeviceMajor:0 DeviceMinor:575 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-481 DeviceMajor:0 DeviceMinor:481 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/var/lib/kubelet/pods/b9774f8c-0f29-46d8-be77-81bcf74d5994/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:580 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-344 DeviceMajor:0 DeviceMinor:344 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/15c8631804d0c1c71f4b64c5ee4ec4990f2c2f6adda4a03d015df366f3b28fd1/userdata/shm DeviceMajor:0 DeviceMinor:569 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-322 DeviceMajor:0 DeviceMinor:322 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1202 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e/userdata/shm DeviceMajor:0 DeviceMinor:145 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-215 DeviceMajor:0 DeviceMinor:215 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-672 DeviceMajor:0 DeviceMinor:672 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cb6e88cd-98de-446a-92e8-f56a2f133703/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:252 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/92c63c95-e880-4f51-9858-7715343f7bd8/volumes/kubernetes.io~projected/kube-api-access-9tl7p DeviceMajor:0 DeviceMinor:827 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691/userdata/shm DeviceMajor:0 DeviceMinor:1041 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-528 DeviceMajor:0 DeviceMinor:528 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1228 DeviceMajor:0 DeviceMinor:1228 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1267 DeviceMajor:0 DeviceMinor:1267 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-752 DeviceMajor:0 DeviceMinor:752 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1013 DeviceMajor:0 DeviceMinor:1013 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95/userdata/shm DeviceMajor:0 DeviceMinor:1211 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff/userdata/shm DeviceMajor:0 DeviceMinor:168 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-641 DeviceMajor:0 DeviceMinor:641 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1002 DeviceMajor:0 DeviceMinor:1002 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc/userdata/shm DeviceMajor:0 DeviceMinor:548 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-811 DeviceMajor:0 DeviceMinor:811 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/0315476e-7140-4777-8061-9cead4c92024/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:863 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-713 DeviceMajor:0 DeviceMinor:713 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-159 DeviceMajor:0 DeviceMinor:159 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24829faf-50e8-45bb-abb0-7cc5ccf81080/volumes/kubernetes.io~projected/kube-api-access-7hp42 DeviceMajor:0 DeviceMinor:272 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-492 DeviceMajor:0 DeviceMinor:492 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:271 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/12b256b7-a57b-4124-8452-25e74cfa7926/volumes/kubernetes.io~projected/kube-api-access-2rg9g DeviceMajor:0 DeviceMinor:805 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1150 DeviceMajor:0 DeviceMinor:1150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-100 DeviceMajor:0 DeviceMinor:100 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-982 DeviceMajor:0 DeviceMinor:982 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c67a2ed2-f520-46fc-84d3-6816dc19f4e0/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:764 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fae9a4cf-2acf-4728-9105-87e004052fe5/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1115 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-909 
DeviceMajor:0 DeviceMinor:909 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f10f592e-5738-4879-b776-246b357d4621/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:142 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:242 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-518 DeviceMajor:0 DeviceMinor:518 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-438 DeviceMajor:0 DeviceMinor:438 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-514 DeviceMajor:0 DeviceMinor:514 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-969 DeviceMajor:0 DeviceMinor:969 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~projected/kube-api-access-2w5kr DeviceMajor:0 DeviceMinor:1168 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/483786a0-0a29-44bf-bbd0-2f37e045aa2c/volumes/kubernetes.io~projected/kube-api-access-88qnh DeviceMajor:0 DeviceMinor:130 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/585f74db-4593-426b-b0c7-ec8f64810549/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:539 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/76c67569-3a72-4de9-87cd-432a4607b15b/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:871 Capacity:49335554048 
Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/cf04aca0-8174-4134-835d-37adf6a3b5ca/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:251 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~projected/kube-api-access-lr868 DeviceMajor:0 DeviceMinor:256 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/85365dec-af50-406c-b258-890e4f454c4a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:781 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/adbf8f71-f005-4e5b-9de1-e49559cf7386/volumes/kubernetes.io~projected/kube-api-access-5vxqg DeviceMajor:0 DeviceMinor:769 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1330 DeviceMajor:0 DeviceMinor:1330 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3488a7eb-5170-478c-9af7-490dbe0f514e/volumes/kubernetes.io~projected/kube-api-access-6qszm DeviceMajor:0 DeviceMinor:258 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:263 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-623 DeviceMajor:0 DeviceMinor:623 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36/userdata/shm DeviceMajor:0 DeviceMinor:757 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/482284fd-6911-4ba6-8d57-7966cc51117a/volumes/kubernetes.io~projected/kube-api-access-khfkr DeviceMajor:0 DeviceMinor:867 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-896 
DeviceMajor:0 DeviceMinor:896 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1010 DeviceMajor:0 DeviceMinor:1010 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1054 DeviceMajor:0 DeviceMinor:1054 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-423 DeviceMajor:0 DeviceMinor:423 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-524 DeviceMajor:0 DeviceMinor:524 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/588a804a-430a-47f4-aa97-c08e907239da/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:565 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-892 DeviceMajor:0 DeviceMinor:892 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b714a9df-026e-423d-a980-2569f0d92e47/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:246 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d4eec9eade1a6fd9bfe0d642fe3ae425b01a962b7129ee11f0681674274aaff6/userdata/shm DeviceMajor:0 DeviceMinor:546 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ceba7b56-f910-473d-aed5-add94868fb31/volumes/kubernetes.io~projected/kube-api-access-72769 DeviceMajor:0 DeviceMinor:684 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-409 DeviceMajor:0 DeviceMinor:409 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-949 DeviceMajor:0 DeviceMinor:949 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1161 DeviceMajor:0 DeviceMinor:1161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-126 DeviceMajor:0 DeviceMinor:126 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-650 
DeviceMajor:0 DeviceMinor:650 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0cebb80d-d898-44c8-82b3-1e18833cee3f/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:821 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-905 DeviceMajor:0 DeviceMinor:905 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/646fece3-4a42-4e0c-bcc7-5f705f948d63/volumes/kubernetes.io~projected/kube-api-access-2jzsd DeviceMajor:0 DeviceMinor:255 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-442 DeviceMajor:0 DeviceMinor:442 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1c60ff3f-2bb1-422e-be27-5eca96d85fd2/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:553 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-644 DeviceMajor:0 DeviceMinor:644 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-341 DeviceMajor:0 DeviceMinor:341 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:240 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:463 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a5727697c2b4cf38a7045ae8edfe9cf2a413bc9b589c95f8630aa7de7ef3ba40/userdata/shm DeviceMajor:0 DeviceMinor:837 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-309 DeviceMajor:0 DeviceMinor:309 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/18da400b-2271-455d-be0d-0ed44c74f78d/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:1169 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/66c72c71-f74a-43ab-bf0d-1f4c93623774/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:555 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9b558268-2262-4593-893e-408639a9987d/volumes/kubernetes.io~projected/kube-api-access-nmmn9 DeviceMajor:0 DeviceMinor:478 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-625 DeviceMajor:0 DeviceMinor:625 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1304 DeviceMajor:0 DeviceMinor:1304 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1134 DeviceMajor:0 DeviceMinor:1134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1207 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1317 DeviceMajor:0 DeviceMinor:1317 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8ca3dee6-f651-4536-991c-303752c22f07/volumes/kubernetes.io~projected/kube-api-access-g999k DeviceMajor:0 DeviceMinor:368 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-951 DeviceMajor:0 DeviceMinor:951 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b9cf1c39-24f0-420b-8020-089616d1cdf0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:248 Capacity:49335554048 Type:vfs Inodes:6166278 
HasInodes:true} {Device:/var/lib/kubelet/pods/255b5a89-1b89-42dc-868a-32ce67975a54/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:820 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783/userdata/shm DeviceMajor:0 DeviceMinor:766 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4c87997a2f68dd1175880f954711e05356d4ed55a7a6f3583b752f9d11da5e55/userdata/shm DeviceMajor:0 DeviceMinor:883 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1154 DeviceMajor:0 DeviceMinor:1154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1125 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1200 DeviceMajor:0 DeviceMinor:1200 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-118 DeviceMajor:0 DeviceMinor:118 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1c60ff3f-2bb1-422e-be27-5eca96d85fd2/volumes/kubernetes.io~projected/kube-api-access-jlz28 DeviceMajor:0 DeviceMinor:556 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-573 DeviceMajor:0 DeviceMinor:573 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7644d6b4dd6d2352356f500ef21c6602c372872ba1a236023043ba253ba34314/userdata/shm DeviceMajor:0 DeviceMinor:326 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes/kubernetes.io~projected/kube-api-access-7hnfl DeviceMajor:0 DeviceMinor:1280 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:overlay_0-1283 DeviceMajor:0 DeviceMinor:1283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d25cb0e087422893459bc0facfa4b23104f61b45c1b3b866e5b4e0e0ad019f99/userdata/shm DeviceMajor:0 DeviceMinor:504 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-585 DeviceMajor:0 DeviceMinor:585 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb/userdata/shm DeviceMajor:0 DeviceMinor:870 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/4373687a-61a0-434b-81f7-3fecaa1494ef/volumes/kubernetes.io~projected/kube-api-access-wv5nj DeviceMajor:0 DeviceMinor:749 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2/volumes/kubernetes.io~projected/kube-api-access-lzj2j DeviceMajor:0 DeviceMinor:832 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-370 DeviceMajor:0 DeviceMinor:370 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1081 DeviceMajor:0 DeviceMinor:1081 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/15ad7f4e-44c6-4426-8b97-c47a47786544/volumes/kubernetes.io~projected/kube-api-access-94ddm DeviceMajor:0 DeviceMinor:1114 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/961e4ecd-545b-4270-ae34-e733dec793b6/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:245 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-721 DeviceMajor:0 DeviceMinor:721 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1196 DeviceMajor:0 DeviceMinor:1196 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1343 DeviceMajor:0 DeviceMinor:1343 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8/userdata/shm DeviceMajor:0 DeviceMinor:269 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5b1e3102064a2333a33694b441523235b1896dd8c0ad7164b8c2f46c1cc4e9c2/userdata/shm DeviceMajor:0 DeviceMinor:833 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16/userdata/shm DeviceMajor:0 DeviceMinor:422 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-591 DeviceMajor:0 DeviceMinor:591 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474/userdata/shm DeviceMajor:0 DeviceMinor:881 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/5cc28e06-3542-4a25-a8b1-5f5b4ee41114/volumes/kubernetes.io~projected/kube-api-access-phzkn DeviceMajor:0 DeviceMinor:1032 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/2e89a047-9ebc-459b-b7b3-e902c1fb0e17/volumes/kubernetes.io~projected/kube-api-access-bhsc6 DeviceMajor:0 DeviceMinor:427 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-603 DeviceMajor:0 DeviceMinor:603 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-190 DeviceMajor:0 DeviceMinor:190 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-917 DeviceMajor:0 DeviceMinor:917 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1271 DeviceMajor:0 DeviceMinor:1271 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-396 DeviceMajor:0 DeviceMinor:396 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-450 DeviceMajor:0 DeviceMinor:450 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/305f42f52b6ba5ef239c92f6ac8cee0e2721fbe74d1ef92a70428b3a6fabdd04/userdata/shm DeviceMajor:0 DeviceMinor:549 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-452 DeviceMajor:0 DeviceMinor:452 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:506 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-320 DeviceMajor:0 DeviceMinor:320 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-361 DeviceMajor:0 DeviceMinor:361 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c84f66f0-207e-436a-8f4e-d1971fa815eb/volumes/kubernetes.io~projected/kube-api-access-gzn8r DeviceMajor:0 DeviceMinor:783 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-693 DeviceMajor:0 DeviceMinor:693 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-900 DeviceMajor:0 DeviceMinor:900 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1/userdata/shm DeviceMajor:0 DeviceMinor:1213 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-133 DeviceMajor:0 DeviceMinor:133 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca/userdata/shm DeviceMajor:0 DeviceMinor:282 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-97 DeviceMajor:0 DeviceMinor:97 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-566 DeviceMajor:0 DeviceMinor:566 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-670 DeviceMajor:0 DeviceMinor:670 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ea0b3538-9a7d-4995-b628-2d63f21d683c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:461 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-648 DeviceMajor:0 DeviceMinor:648 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-756 DeviceMajor:0 DeviceMinor:756 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-991 DeviceMajor:0 DeviceMinor:991 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-307 DeviceMajor:0 DeviceMinor:307 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9/userdata/shm DeviceMajor:0 DeviceMinor:1139 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c0a39496-5e47-4415-b8bf-ed0634797ce1/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1129 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1326 DeviceMajor:0 DeviceMinor:1326 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e2d00ece-7586-4346-adbb-eaae1aeda69e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:241 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/71856c04f28ed0a4e9a36c70729f2e0d164816c342db7fab0a6d5f76b0f61b6a/userdata/shm DeviceMajor:0 DeviceMinor:379 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 
Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:121fb1d62a402b2 MacAddress:da:ce:43:8d:52:4b Speed:10000 Mtu:8900} {Name:14329fd568a14f0 MacAddress:de:8c:52:57:f2:a6 Speed:10000 Mtu:8900} {Name:15c8631804d0c1c MacAddress:e6:f0:79:91:5b:8a Speed:10000 Mtu:8900} {Name:16c1ddc99b10767 MacAddress:8a:04:ce:d8:b7:3e Speed:10000 Mtu:8900} {Name:18bf2f609f8efa0 MacAddress:aa:25:b2:f1:c2:b5 Speed:10000 Mtu:8900} {Name:223e6055ac3ccbf MacAddress:06:06:8e:58:27:3b Speed:10000 Mtu:8900} {Name:2421d3a73005cb8 MacAddress:ce:7c:97:39:c9:d8 Speed:10000 Mtu:8900} {Name:244c9349c0c82d2 MacAddress:52:98:dd:5c:03:c6 Speed:10000 Mtu:8900} {Name:2a52d8e1940b8a6 MacAddress:06:89:26:24:48:83 Speed:10000 Mtu:8900} {Name:305f42f52b6ba5e MacAddress:36:cb:8d:02:08:06 Speed:10000 Mtu:8900} {Name:390cfd29e9f1dac MacAddress:62:d7:c3:e9:d7:f4 Speed:10000 Mtu:8900} {Name:3dbe1f3d3698f2e MacAddress:9a:f1:86:f9:fa:ea Speed:10000 Mtu:8900} {Name:4257fe78462bb2b MacAddress:36:c7:40:a7:fd:2c Speed:10000 Mtu:8900} {Name:4782d187d8efc0b MacAddress:ce:ae:c3:b0:18:fb Speed:10000 Mtu:8900} {Name:4ae19cabdf4e15b MacAddress:52:7f:4b:37:20:c7 Speed:10000 Mtu:8900} {Name:4b7d2c8100142f9 MacAddress:2e:c1:0b:ed:3e:17 Speed:10000 Mtu:8900} {Name:4c87997a2f68dd1 MacAddress:56:4f:05:1e:c6:45 Speed:10000 Mtu:8900} {Name:5110b129f87dd0c MacAddress:b2:76:37:37:d2:74 Speed:10000 Mtu:8900} {Name:5b1e3102064a233 MacAddress:fe:43:a3:ad:f8:0f Speed:10000 Mtu:8900} {Name:5c45c6ef7e4b05f MacAddress:52:d0:af:7c:bf:d5 Speed:10000 Mtu:8900} {Name:601970e99fee05d MacAddress:82:86:d6:fa:c5:ec Speed:10000 Mtu:8900} {Name:6d0f92f8c3e4f5b MacAddress:8e:55:2d:21:f0:7c Speed:10000 Mtu:8900} {Name:6ffc0e356ee8d2e MacAddress:de:fe:8f:d5:58:a5 Speed:10000 Mtu:8900} {Name:71856c04f28ed0a MacAddress:92:53:2b:21:3f:04 Speed:10000 Mtu:8900} {Name:7644d6b4dd6d235 MacAddress:6e:5c:3d:93:b6:5e Speed:10000 Mtu:8900} {Name:7a9436157615441 MacAddress:6a:e1:08:2c:ce:37 Speed:10000 Mtu:8900} 
{Name:7c2bb3b30fb024e MacAddress:e2:ad:1b:66:0d:d9 Speed:10000 Mtu:8900} {Name:82317c2ff886c91 MacAddress:96:1d:a9:20:ee:3c Speed:10000 Mtu:8900} {Name:842c59a633c6726 MacAddress:be:3f:47:31:00:e6 Speed:10000 Mtu:8900} {Name:867a47d4a06c655 MacAddress:b2:84:b4:ac:44:61 Speed:10000 Mtu:8900} {Name:8a899f46b5ae367 MacAddress:36:6f:d9:30:b4:67 Speed:10000 Mtu:8900} {Name:8bce00dde4bf57f MacAddress:8a:75:a3:7a:ce:97 Speed:10000 Mtu:8900} {Name:97f199fa26d5c31 MacAddress:9e:e5:e9:ad:d1:77 Speed:10000 Mtu:8900} {Name:9c6c5f4b9ba45ac MacAddress:96:a4:fc:13:60:eb Speed:10000 Mtu:8900} {Name:a5727697c2b4cf3 MacAddress:aa:cd:6e:5a:3f:50 Speed:10000 Mtu:8900} {Name:a928f690a2e58a2 MacAddress:12:cc:0c:d7:d7:e5 Speed:10000 Mtu:8900} {Name:a97726e86565351 MacAddress:0e:63:8f:1a:46:e4 Speed:10000 Mtu:8900} {Name:b0f1382249dc5f2 MacAddress:2e:93:ad:44:de:01 Speed:10000 Mtu:8900} {Name:b4132c8230caf30 MacAddress:b2:72:ed:7d:57:07 Speed:10000 Mtu:8900} {Name:b6510ef0b5fc51e MacAddress:7e:94:53:5f:3b:ea Speed:10000 Mtu:8900} {Name:b6a377cd7739065 MacAddress:6a:59:40:15:99:a1 Speed:10000 Mtu:8900} {Name:b960fc2af9e3400 MacAddress:72:26:2c:e9:be:e2 Speed:10000 Mtu:8900} {Name:ba80f8cbf4454b2 MacAddress:c6:ce:96:fb:95:12 Speed:10000 Mtu:8900} {Name:bf4e70417a5730a MacAddress:8a:e7:fe:22:f6:77 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:92:5a:b3:60:03:d3 Speed:0 Mtu:8900} {Name:d4eec9eade1a6fd MacAddress:f2:87:80:82:37:24 Speed:10000 Mtu:8900} {Name:d9d7c0c01b5302e MacAddress:9a:30:ea:6e:52:c4 Speed:10000 Mtu:8900} {Name:dac97420bb9e388 MacAddress:66:08:4b:04:33:32 Speed:10000 Mtu:8900} {Name:e209b32301611ac MacAddress:ae:2a:4d:5a:a7:56 Speed:10000 Mtu:8900} {Name:e8da33da933e202 MacAddress:42:d9:22:08:86:1c Speed:10000 Mtu:8900} {Name:edfafba30f67b29 MacAddress:9e:64:cf:95:a4:ec Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:88:49:0a Speed:-1 Mtu:9000} {Name:eth2 
MacAddress:fa:16:3e:4d:95:0b Speed:-1 Mtu:9000} {Name:f2696aa250be24e MacAddress:ba:e2:97:78:ec:8f Speed:10000 Mtu:8900} {Name:f5da67148a68052 MacAddress:2e:af:c8:50:c5:d9 Speed:10000 Mtu:8900} {Name:f9fabe4de8507d0 MacAddress:5a:d1:0e:b2:a0:a0 Speed:10000 Mtu:8900} {Name:fcec922662159dc MacAddress:96:31:3b:0f:d4:f3 Speed:10000 Mtu:8900} {Name:fe38a11f2899f29 MacAddress:c2:8b:b9:61:c6:12 Speed:10000 Mtu:8900} {Name:fee0bde3d0eee2f MacAddress:02:77:d9:74:c9:0b Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:4a:ca:1a:d6:6e:89 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 
Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified 
Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 23 14:34:28.032744 master-0 kubenswrapper[28758]: I0223 14:34:28.032199 28758 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 23 14:34:28.032744 master-0 kubenswrapper[28758]: I0223 14:34:28.032344 28758 manager.go:233] Version: {KernelVersion:5.14.0-427.109.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602022246-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 23 14:34:28.032744 master-0 kubenswrapper[28758]: I0223 14:34:28.032706 28758 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.032909 28758 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.032936 28758 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033134 28758 topology_manager.go:138] "Creating topology manager with none policy" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033143 28758 container_manager_linux.go:303] "Creating device plugin manager" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033151 28758 
manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033173 28758 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033217 28758 state_mem.go:36] "Initialized new in-memory state store" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033309 28758 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033370 28758 kubelet.go:418] "Attempting to sync node with API server" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033382 28758 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033396 28758 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033407 28758 kubelet.go:324] "Adding apiserver pod source" Feb 23 14:34:28.033565 master-0 kubenswrapper[28758]: I0223 14:34:28.033417 28758 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 23 14:34:28.034503 master-0 kubenswrapper[28758]: I0223 14:34:28.034453 28758 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-6.rhaos4.18.git7ed6156.el9" apiVersion="v1" Feb 23 14:34:28.034608 master-0 kubenswrapper[28758]: I0223 14:34:28.034584 28758 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 23 14:34:28.034926 master-0 kubenswrapper[28758]: I0223 14:34:28.034901 28758 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 23 14:34:28.035041 master-0 kubenswrapper[28758]: I0223 14:34:28.035014 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 23 14:34:28.035041 master-0 kubenswrapper[28758]: I0223 14:34:28.035039 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035047 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035055 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035062 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035068 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035075 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035081 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035088 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035094 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035103 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 23 14:34:28.035107 master-0 kubenswrapper[28758]: I0223 14:34:28.035114 28758 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 23 14:34:28.035358 master-0 kubenswrapper[28758]: I0223 14:34:28.035135 28758 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 23 14:34:28.035500 master-0 kubenswrapper[28758]: I0223 14:34:28.035466 28758 server.go:1280] "Started kubelet"
Feb 23 14:34:28.036056 master-0 kubenswrapper[28758]: I0223 14:34:28.036015 28758 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 23 14:34:28.036251 master-0 systemd[1]: Started Kubernetes Kubelet.
Feb 23 14:34:28.036392 master-0 kubenswrapper[28758]: I0223 14:34:28.036283 28758 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 23 14:34:28.036842 master-0 kubenswrapper[28758]: I0223 14:34:28.036808 28758 server_v1.go:47] "podresources" method="list" useActivePods=true
Feb 23 14:34:28.037493 master-0 kubenswrapper[28758]: I0223 14:34:28.037441 28758 server.go:449] "Adding debug handlers to kubelet server"
Feb 23 14:34:28.040355 master-0 kubenswrapper[28758]: I0223 14:34:28.039799 28758 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 23 14:34:28.052353 master-0 kubenswrapper[28758]: E0223 14:34:28.052285 28758 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Feb 23 14:34:28.061864 master-0 kubenswrapper[28758]: I0223 14:34:28.061782 28758 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 23 14:34:28.062051 master-0 kubenswrapper[28758]: I0223 14:34:28.061923 28758 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 23 14:34:28.062051 master-0 kubenswrapper[28758]: I0223 14:34:28.061932 28758 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 14:08:38 +0000 UTC, rotation deadline is 2026-02-24 08:44:00.554551575 +0000 UTC
Feb 23 14:34:28.062051 master-0 kubenswrapper[28758]: I0223 14:34:28.062002 28758 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h9m32.492559602s for next certificate rotation
Feb 23 14:34:28.062152 master-0 kubenswrapper[28758]: I0223 14:34:28.062103 28758 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 23 14:34:28.062152 master-0 kubenswrapper[28758]: I0223 14:34:28.062136 28758 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 23 14:34:28.062209 master-0 kubenswrapper[28758]: E0223 14:34:28.062156 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Feb 23 14:34:28.062238 master-0 kubenswrapper[28758]: I0223 14:34:28.062211 28758 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Feb 23 14:34:28.062962 master-0 kubenswrapper[28758]: I0223 14:34:28.062923 28758 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 23 14:34:28.062962 master-0 kubenswrapper[28758]: I0223 14:34:28.062951 28758 factory.go:55] Registering systemd factory
Feb 23 14:34:28.062962 master-0 kubenswrapper[28758]: I0223 14:34:28.062963 28758 factory.go:221] Registration of the systemd container factory successfully
Feb 23 14:34:28.065297 master-0 kubenswrapper[28758]: I0223 14:34:28.064636 28758 factory.go:153] Registering CRI-O factory
Feb 23 14:34:28.065297 master-0 kubenswrapper[28758]: I0223 14:34:28.064664 28758 factory.go:221] Registration of the crio container factory successfully
Feb 23 14:34:28.065297 master-0 kubenswrapper[28758]: I0223 14:34:28.064697 28758 factory.go:103] Registering Raw factory
Feb 23 14:34:28.065297 master-0 kubenswrapper[28758]: I0223 14:34:28.064712 28758 manager.go:1196] Started watching for new ooms in manager
Feb 23 14:34:28.065297 master-0 kubenswrapper[28758]: I0223 14:34:28.065178 28758 manager.go:319] Starting recovery of all containers
Feb 23 14:34:28.075869 master-0 kubenswrapper[28758]: I0223 14:34:28.075757 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="959c2393-e914-4c10-a18f-b30fcf012d19" volumeName="kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca" seLinuxMountContext=""
Feb 23 14:34:28.075869 master-0 kubenswrapper[28758]: I0223 14:34:28.075863 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cebb80d-d898-44c8-82b3-1e18833cee3f" volumeName="kubernetes.io/projected/0cebb80d-d898-44c8-82b3-1e18833cee3f-kube-api-access-44599" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075878 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12b256b7-a57b-4124-8452-25e74cfa7926" volumeName="kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075891 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76c67569-3a72-4de9-87cd-432a4607b15b" volumeName="kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075902 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075913 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a39496-5e47-4415-b8bf-ed0634797ce1" volumeName="kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075926 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08c561b3-613b-425f-9de4-d5fc8762ea51" volumeName="kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075938 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18da400b-2271-455d-be0d-0ed44c74f78d" volumeName="kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075954 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" volumeName="kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-kube-api-access-jlz28" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075968 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85365dec-af50-406c-b258-890e4f454c4a" volumeName="kubernetes.io/secret/85365dec-af50-406c-b258-890e4f454c4a-cloud-credential-operator-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075978 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf04aca0-8174-4134-835d-37adf6a3b5ca" volumeName="kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.075992 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076004 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09d80e28-0b64-4c5d-a9bc-99d843d40165" volumeName="kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076020 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076031 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b558268-2262-4593-893e-408639a9987d" volumeName="kubernetes.io/projected/9b558268-2262-4593-893e-408639a9987d-kube-api-access-nmmn9" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076048 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9cf1c39-24f0-420b-8020-089616d1cdf0" volumeName="kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076061 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c" volumeName="kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076072 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08c561b3-613b-425f-9de4-d5fc8762ea51" volumeName="kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076084 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a283e3a-33ba-4ef7-87d3-55ed8c953fb4" volumeName="kubernetes.io/secret/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-samples-operator-tls" seLinuxMountContext=""
Feb 23 14:34:28.076082 master-0 kubenswrapper[28758]: I0223 14:34:28.076096 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2e89a047-9ebc-459b-b7b3-e902c1fb0e17" volumeName="kubernetes.io/projected/2e89a047-9ebc-459b-b7b3-e902c1fb0e17-kube-api-access-bhsc6" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076107 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b54fc16-d2f7-4b10-a611-5b411b389c5a" volumeName="kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076120 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="865ceedb-b19a-4f2f-b295-311e1b7a645e" volumeName="kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076133 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="959c2393-e914-4c10-a18f-b30fcf012d19" volumeName="kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076145 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b714a9df-026e-423d-a980-2569f0d92e47" volumeName="kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076157 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c84f66f0-207e-436a-8f4e-d1971fa815eb" volumeName="kubernetes.io/projected/c84f66f0-207e-436a-8f4e-d1971fa815eb-kube-api-access-gzn8r" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076198 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbb66172-1ea9-4683-b88f-227c4fd94924" volumeName="kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076213 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5cc28e06-3542-4a25-a8b1-5f5b4ee41114" volumeName="kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076227 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66c72c71-f74a-43ab-bf0d-1f4c93623774" volumeName="kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-ca-certs" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076241 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="959c2393-e914-4c10-a18f-b30fcf012d19" volumeName="kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076272 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdde2df-cd07-4898-88f4-7ecde0e04d7a" volumeName="kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-catalog-content" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076284 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18da400b-2271-455d-be0d-0ed44c74f78d" volumeName="kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076296 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66c72c71-f74a-43ab-bf0d-1f4c93623774" volumeName="kubernetes.io/secret/66c72c71-f74a-43ab-bf0d-1f4c93623774-catalogserver-certs" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076307 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f989cd-6c19-4a30-833a-10e98b7a0326" volumeName="kubernetes.io/projected/87f989cd-6c19-4a30-833a-10e98b7a0326-kube-api-access-wpqzn" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076318 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076332 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a801da1-a7eb-4187-98b8-315076f55e19" volumeName="kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076343 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad0f0d72-0337-4347-bb50-e299a175f3ca" volumeName="kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-bound-sa-token" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076355 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea0b3538-9a7d-4995-b628-2d63f21d683c" volumeName="kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-encryption-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076367 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0315476e-7140-4777-8061-9cead4c92024" volumeName="kubernetes.io/empty-dir/0315476e-7140-4777-8061-9cead4c92024-tmpfs" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076380 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="646fece3-4a42-4e0c-bcc7-5f705f948d63" volumeName="kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076392 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076404 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076415 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="961e4ecd-545b-4270-ae34-e733dec793b6" volumeName="kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076654 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af950a67-1557-4352-8100-27281bb8ecbe" volumeName="kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076675 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="adbf8f71-f005-4e5b-9de1-e49559cf7386" volumeName="kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-catalog-content" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076687 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9774f8c-0f29-46d8-be77-81bcf74d5994" volumeName="kubernetes.io/secret/b9774f8c-0f29-46d8-be77-81bcf74d5994-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076701 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf04aca0-8174-4134-835d-37adf6a3b5ca" volumeName="kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076714 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076727 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="06bde94a-3126-4d0f-baba-49dc5fbec61b" volumeName="kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-metrics-certs" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076740 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="172d47fd-e1a1-4d77-9e31-c4f22e824d5f" volumeName="kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076752 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae4baa4e-4ef4-433d-aa36-149e92fa6ee2" volumeName="kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076765 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c" volumeName="kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076776 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0315476e-7140-4777-8061-9cead4c92024" volumeName="kubernetes.io/projected/0315476e-7140-4777-8061-9cead4c92024-kube-api-access-jtp7w" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076793 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="646fece3-4a42-4e0c-bcc7-5f705f948d63" volumeName="kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076807 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c67a2ed2-f520-46fc-84d3-6816dc19f4e0" volumeName="kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076820 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ded555da-db03-498e-81a9-ad166f29a2aa" volumeName="kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076833 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="06bde94a-3126-4d0f-baba-49dc5fbec61b" volumeName="kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-stats-auth" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076848 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="709ac071-4392-4a3f-a3d1-4bc8ba2f6236" volumeName="kubernetes.io/secret/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-key" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076860 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076875 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea0b3538-9a7d-4995-b628-2d63f21d683c" volumeName="kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-serving-ca" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076886 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="959c2393-e914-4c10-a18f-b30fcf012d19" volumeName="kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076899 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="06bde94a-3126-4d0f-baba-49dc5fbec61b" volumeName="kubernetes.io/projected/06bde94a-3126-4d0f-baba-49dc5fbec61b-kube-api-access-2p6hn" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076910 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cebb80d-d898-44c8-82b3-1e18833cee3f" volumeName="kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076923 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4373687a-61a0-434b-81f7-3fecaa1494ef" volumeName="kubernetes.io/projected/4373687a-61a0-434b-81f7-3fecaa1494ef-kube-api-access-wv5nj" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076939 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-audit" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076977 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.076990 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9774f8c-0f29-46d8-be77-81bcf74d5994" volumeName="kubernetes.io/configmap/b9774f8c-0f29-46d8-be77-81bcf74d5994-service-ca" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077001 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15ad7f4e-44c6-4426-8b97-c47a47786544" volumeName="kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077013 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b558268-2262-4593-893e-408639a9987d" volumeName="kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-etc-tuned" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077025 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c84f66f0-207e-436a-8f4e-d1971fa815eb" volumeName="kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-utilities" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077036 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceba7b56-f910-473d-aed5-add94868fb31" volumeName="kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077049 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c" volumeName="kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077080 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" volumeName="kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-ca-certs" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077095 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f876e5d-2e82-47d0-8a9c-adacf2bddf77" volumeName="kubernetes.io/projected/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-kube-api-access-pwtcj" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077108 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3" volumeName="kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077121 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae4baa4e-4ef4-433d-aa36-149e92fa6ee2" volumeName="kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077133 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077144 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fae9a4cf-2acf-4728-9105-87e004052fe5" volumeName="kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077155 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="585f74db-4593-426b-b0c7-ec8f64810549" volumeName="kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077183 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5cc28e06-3542-4a25-a8b1-5f5b4ee41114" volumeName="kubernetes.io/projected/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-kube-api-access-phzkn" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077196 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="865ceedb-b19a-4f2f-b295-311e1b7a645e" volumeName="kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077209 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dd5fa7c-0519-4170-89c6-b369e5fc1990" volumeName="kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077221 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12b256b7-a57b-4124-8452-25e74cfa7926" volumeName="kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-images" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077235 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3488a7eb-5170-478c-9af7-490dbe0f514e" volumeName="kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077247 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="482284fd-6911-4ba6-8d57-7966cc51117a" volumeName="kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077261 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" volumeName="kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077276 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea0b3538-9a7d-4995-b628-2d63f21d683c" volumeName="kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-policies" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077290 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="06bde94a-3126-4d0f-baba-49dc5fbec61b" volumeName="kubernetes.io/configmap/06bde94a-3126-4d0f-baba-49dc5fbec61b-service-ca-bundle" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077302 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a801da1-a7eb-4187-98b8-315076f55e19" volumeName="kubernetes.io/configmap/6a801da1-a7eb-4187-98b8-315076f55e19-config-volume" seLinuxMountContext=""
Feb 23 14:34:28.077253 master-0 kubenswrapper[28758]: I0223 14:34:28.077317 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="961e4ecd-545b-4270-ae34-e733dec793b6" volumeName="kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077331 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077352 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="482284fd-6911-4ba6-8d57-7966cc51117a" volumeName="kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077365 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="607c1101-3533-43e3-9eda-13cea2b9dbb6" volumeName="kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077377 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66c72c71-f74a-43ab-bf0d-1f4c93623774" volumeName="kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-kube-api-access-xlqzc" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077390 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9774f8c-0f29-46d8-be77-81bcf74d5994" volumeName="kubernetes.io/projected/b9774f8c-0f29-46d8-be77-81bcf74d5994-kube-api-access" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077403 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="674041a2-e2b0-4286-88cc-f1b00571e3f3" volumeName="kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077415 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" volumeName="kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077429 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15ad7f4e-44c6-4426-8b97-c47a47786544" volumeName="kubernetes.io/empty-dir/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-textfile" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077442 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="172d47fd-e1a1-4d77-9e31-c4f22e824d5f" volumeName="kubernetes.io/projected/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-kube-api-access-9x6q2" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077456 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9416f5d0-32b4-4065-b678-26913af8b6dd" volumeName="kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077469 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a4ae9292-71dc-4484-b277-43cb26c1e04d" volumeName="kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077508 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="adbf8f71-f005-4e5b-9de1-e49559cf7386" volumeName="kubernetes.io/projected/adbf8f71-f005-4e5b-9de1-e49559cf7386-kube-api-access-5vxqg" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077521 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" volumeName="kubernetes.io/empty-dir/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-cache" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077533 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="482284fd-6911-4ba6-8d57-7966cc51117a" volumeName="kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077545 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="709ac071-4392-4a3f-a3d1-4bc8ba2f6236" volumeName="kubernetes.io/configmap/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-cabundle" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077565 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87f989cd-6c19-4a30-833a-10e98b7a0326" volumeName="kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077579 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea0b3538-9a7d-4995-b628-2d63f21d683c" volumeName="kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-client" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077593 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f86e881-275c-4387-a23a-06c559c8f1e8" volumeName="kubernetes.io/projected/3f86e881-275c-4387-a23a-06c559c8f1e8-kube-api-access-wp8kk" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077606 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077620 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077632 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b714a9df-026e-423d-a980-2569f0d92e47" volumeName="kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077645 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cb6e88cd-98de-446a-92e8-f56a2f133703" volumeName="kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert"
seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077658 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18da400b-2271-455d-be0d-0ed44c74f78d" volumeName="kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077673 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a801da1-a7eb-4187-98b8-315076f55e19" volumeName="kubernetes.io/projected/6a801da1-a7eb-4187-98b8-315076f55e19-kube-api-access-pqkz4" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077686 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="709ac071-4392-4a3f-a3d1-4bc8ba2f6236" volumeName="kubernetes.io/projected/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-kube-api-access-6qhr9" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077698 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9416f5d0-32b4-4065-b678-26913af8b6dd" volumeName="kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077710 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="961e4ecd-545b-4270-ae34-e733dec793b6" volumeName="kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077721 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" 
volumeName="kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077734 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077746 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12b256b7-a57b-4124-8452-25e74cfa7926" volumeName="kubernetes.io/projected/12b256b7-a57b-4124-8452-25e74cfa7926-kube-api-access-2rg9g" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077758 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3" volumeName="kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077770 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077781 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92c63c95-e880-4f51-9858-7715343f7bd8" volumeName="kubernetes.io/projected/92c63c95-e880-4f51-9858-7715343f7bd8-kube-api-access-9tl7p" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077794 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09d80e28-0b64-4c5d-a9bc-99d843d40165" volumeName="kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077805 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92c63c95-e880-4f51-9858-7715343f7bd8" volumeName="kubernetes.io/empty-dir/92c63c95-e880-4f51-9858-7715343f7bd8-available-featuregates" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077818 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceba7b56-f910-473d-aed5-add94868fb31" volumeName="kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077829 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a283e3a-33ba-4ef7-87d3-55ed8c953fb4" volumeName="kubernetes.io/projected/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-kube-api-access-rr7rw" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077842 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/projected/588a804a-430a-47f4-aa97-c08e907239da-kube-api-access-hzrqz" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077853 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf04aca0-8174-4134-835d-37adf6a3b5ca" volumeName="kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077864 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="76c67569-3a72-4de9-87cd-432a4607b15b" volumeName="kubernetes.io/projected/76c67569-3a72-4de9-87cd-432a4607b15b-kube-api-access-2jlzj" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077875 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8ca3dee6-f651-4536-991c-303752c22f07" volumeName="kubernetes.io/projected/8ca3dee6-f651-4536-991c-303752c22f07-kube-api-access-g999k" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077886 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="959c2393-e914-4c10-a18f-b30fcf012d19" volumeName="kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077896 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af950a67-1557-4352-8100-27281bb8ecbe" volumeName="kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077907 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fbb66172-1ea9-4683-b88f-227c4fd94924" volumeName="kubernetes.io/projected/fbb66172-1ea9-4683-b88f-227c4fd94924-kube-api-access-kl87q" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077918 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077929 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="ace75aae-6f4f-4299-90e2-d5292271b136" volumeName="kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077940 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="adbf8f71-f005-4e5b-9de1-e49559cf7386" volumeName="kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-utilities" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077952 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea0b3538-9a7d-4995-b628-2d63f21d683c" volumeName="kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-trusted-ca-bundle" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077964 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0315476e-7140-4777-8061-9cead4c92024" volumeName="kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077976 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="255b5a89-1b89-42dc-868a-32ce67975a54" volumeName="kubernetes.io/projected/255b5a89-1b89-42dc-868a-32ce67975a54-kube-api-access-5nn4m" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.077988 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbe678de-546d-49d0-8280-3f6d94fa5e4f" volumeName="kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078001 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c" volumeName="kubernetes.io/empty-dir/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-volume-directive-shadow" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078012 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0cebb80d-d898-44c8-82b3-1e18833cee3f" volumeName="kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078025 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15ad7f4e-44c6-4426-8b97-c47a47786544" volumeName="kubernetes.io/projected/15ad7f4e-44c6-4426-8b97-c47a47786544-kube-api-access-94ddm" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078037 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57b57915-64dd-42f5-b06f-bc4bcc06b667" volumeName="kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078048 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b558268-2262-4593-893e-408639a9987d" volumeName="kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-tmp" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078094 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24829faf-50e8-45bb-abb0-7cc5ccf81080" volumeName="kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078136 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b9cf1c39-24f0-420b-8020-089616d1cdf0" volumeName="kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078150 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c67a2ed2-f520-46fc-84d3-6816dc19f4e0" volumeName="kubernetes.io/projected/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-kube-api-access-hj8ff" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078163 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078177 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a39496-5e47-4415-b8bf-ed0634797ce1" volumeName="kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078189 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c67a2ed2-f520-46fc-84d3-6816dc19f4e0" volumeName="kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078200 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57b57915-64dd-42f5-b06f-bc4bcc06b667" volumeName="kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078236 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5cc28e06-3542-4a25-a8b1-5f5b4ee41114" volumeName="kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078249 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66c72c71-f74a-43ab-bf0d-1f4c93623774" volumeName="kubernetes.io/empty-dir/66c72c71-f74a-43ab-bf0d-1f4c93623774-cache" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078261 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dd5fa7c-0519-4170-89c6-b369e5fc1990" volumeName="kubernetes.io/projected/8dd5fa7c-0519-4170-89c6-b369e5fc1990-kube-api-access-chs7z" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078275 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea0b3538-9a7d-4995-b628-2d63f21d683c" volumeName="kubernetes.io/projected/ea0b3538-9a7d-4995-b628-2d63f21d683c-kube-api-access-2cd7w" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078289 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92c63c95-e880-4f51-9858-7715343f7bd8" volumeName="kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078302 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af950a67-1557-4352-8100-27281bb8ecbe" volumeName="kubernetes.io/projected/af950a67-1557-4352-8100-27281bb8ecbe-kube-api-access-jxrvf" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078315 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b9cf1c39-24f0-420b-8020-089616d1cdf0" volumeName="kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078326 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e5104cdd-85b8-49ba-95ca-3e9c8218a01e" volumeName="kubernetes.io/projected/e5104cdd-85b8-49ba-95ca-3e9c8218a01e-kube-api-access-8qfr7" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078338 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-image-import-ca" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078350 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-etcd-serving-ca" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078362 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078374 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" volumeName="kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078387 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="172d47fd-e1a1-4d77-9e31-c4f22e824d5f" volumeName="kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078398 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18da400b-2271-455d-be0d-0ed44c74f78d" volumeName="kubernetes.io/projected/18da400b-2271-455d-be0d-0ed44c74f78d-kube-api-access-2w5kr" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078409 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4373687a-61a0-434b-81f7-3fecaa1494ef" volumeName="kubernetes.io/secret/4373687a-61a0-434b-81f7-3fecaa1494ef-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078438 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="607c1101-3533-43e3-9eda-13cea2b9dbb6" volumeName="kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078451 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdde2df-cd07-4898-88f4-7ecde0e04d7a" volumeName="kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-utilities" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078463 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12b256b7-a57b-4124-8452-25e74cfa7926" volumeName="kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078475 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="585f74db-4593-426b-b0c7-ec8f64810549" volumeName="kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078503 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cb6e88cd-98de-446a-92e8-f56a2f133703" volumeName="kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078514 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceba7b56-f910-473d-aed5-add94868fb31" volumeName="kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078527 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3488a7eb-5170-478c-9af7-490dbe0f514e" volumeName="kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078538 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57b57915-64dd-42f5-b06f-bc4bcc06b667" volumeName="kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078549 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-serving-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078560 28758 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="cb6e88cd-98de-446a-92e8-f56a2f133703" volumeName="kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078571 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad0f0d72-0337-4347-bb50-e299a175f3ca" volumeName="kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-kube-api-access-knkx2" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078583 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0315476e-7140-4777-8061-9cead4c92024" volumeName="kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078594 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f86e881-275c-4387-a23a-06c559c8f1e8" volumeName="kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-catalog-content" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078607 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9416f5d0-32b4-4065-b678-26913af8b6dd" volumeName="kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078619 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad0f0d72-0337-4347-bb50-e299a175f3ca" volumeName="kubernetes.io/configmap/ad0f0d72-0337-4347-bb50-e299a175f3ca-trusted-ca" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078633 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-encryption-config" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078648 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ceba7b56-f910-473d-aed5-add94868fb31" volumeName="kubernetes.io/projected/ceba7b56-f910-473d-aed5-add94868fb31-kube-api-access-72769" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078661 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f10f592e-5738-4879-b776-246b357d4621" volumeName="kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078672 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e2d00ece-7586-4346-adbb-eaae1aeda69e" volumeName="kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078684 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="255b5a89-1b89-42dc-868a-32ce67975a54" volumeName="kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078696 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3488a7eb-5170-478c-9af7-490dbe0f514e" volumeName="kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078707 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="c02c8912-46c9-4f86-ad28-9bfb2eca4e54" volumeName="kubernetes.io/secret/c02c8912-46c9-4f86-ad28-9bfb2eca4e54-tls-certificates" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078719 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" volumeName="kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078731 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdde2df-cd07-4898-88f4-7ecde0e04d7a" volumeName="kubernetes.io/projected/efdde2df-cd07-4898-88f4-7ecde0e04d7a-kube-api-access-tpc4t" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078743 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c" volumeName="kubernetes.io/projected/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-api-access-dzcqx" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078754 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="06bde94a-3126-4d0f-baba-49dc5fbec61b" volumeName="kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-default-certificate" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078767 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3" volumeName="kubernetes.io/projected/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-kube-api-access-qtbcj" seLinuxMountContext="" Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078795 28758 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="9416f5d0-32b4-4065-b678-26913af8b6dd" volumeName="kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078807 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c84f66f0-207e-436a-8f4e-d1971fa815eb" volumeName="kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-catalog-content" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078820 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078832 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="76c67569-3a72-4de9-87cd-432a4607b15b" volumeName="kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078844 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b714a9df-026e-423d-a980-2569f0d92e47" volumeName="kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078856 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15ad7f4e-44c6-4426-8b97-c47a47786544" volumeName="kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078870 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078884 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-etcd-client" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078896 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae4baa4e-4ef4-433d-aa36-149e92fa6ee2" volumeName="kubernetes.io/projected/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-kube-api-access-lzj2j" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078907 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="12b256b7-a57b-4124-8452-25e74cfa7926" volumeName="kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078919 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57b57915-64dd-42f5-b06f-bc4bcc06b667" volumeName="kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078932 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="646fece3-4a42-4e0c-bcc7-5f705f948d63" volumeName="kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078944 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ace75aae-6f4f-4299-90e2-d5292271b136" volumeName="kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078955 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b090ed5a-984f-41dd-8cea-34a1ece1514f" volumeName="kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078972 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fae9a4cf-2acf-4728-9105-87e004052fe5" volumeName="kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078984 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="585f74db-4593-426b-b0c7-ec8f64810549" volumeName="kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.078995 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9416f5d0-32b4-4065-b678-26913af8b6dd" volumeName="kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079007 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="482284fd-6911-4ba6-8d57-7966cc51117a" volumeName="kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079019 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae4baa4e-4ef4-433d-aa36-149e92fa6ee2" volumeName="kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079031 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" volumeName="kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079043 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fae9a4cf-2acf-4728-9105-87e004052fe5" volumeName="kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079058 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b54fc16-d2f7-4b10-a611-5b411b389c5a" volumeName="kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079071 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="588a804a-430a-47f4-aa97-c08e907239da" volumeName="kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-trusted-ca-bundle" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079083 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9416f5d0-32b4-4065-b678-26913af8b6dd" volumeName="kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079096 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c" volumeName="kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079109 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09d80e28-0b64-4c5d-a9bc-99d843d40165" volumeName="kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079120 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24829faf-50e8-45bb-abb0-7cc5ccf81080" volumeName="kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079132 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9416f5d0-32b4-4065-b678-26913af8b6dd" volumeName="kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079146 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c67a2ed2-f520-46fc-84d3-6816dc19f4e0" volumeName="kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079158 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="483786a0-0a29-44bf-bbd0-2f37e045aa2c" volumeName="kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079172 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fae9a4cf-2acf-4728-9105-87e004052fe5" volumeName="kubernetes.io/projected/fae9a4cf-2acf-4728-9105-87e004052fe5-kube-api-access-x8v9z" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079185 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15ad7f4e-44c6-4426-8b97-c47a47786544" volumeName="kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079199 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24829faf-50e8-45bb-abb0-7cc5ccf81080" volumeName="kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079214 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="865ceedb-b19a-4f2f-b295-311e1b7a645e" volumeName="kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079227 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea0b3538-9a7d-4995-b628-2d63f21d683c" volumeName="kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-serving-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079281 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae4baa4e-4ef4-433d-aa36-149e92fa6ee2" volumeName="kubernetes.io/empty-dir/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-snapshots" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079295 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0a39496-5e47-4415-b8bf-ed0634797ce1" volumeName="kubernetes.io/projected/c0a39496-5e47-4415-b8bf-ed0634797ce1-kube-api-access-9sflb" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079307 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="172d47fd-e1a1-4d77-9e31-c4f22e824d5f" volumeName="kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079320 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="255b5a89-1b89-42dc-868a-32ce67975a54" volumeName="kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079333 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85365dec-af50-406c-b258-890e4f454c4a" volumeName="kubernetes.io/projected/85365dec-af50-406c-b258-890e4f454c4a-kube-api-access-k5d9w" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079344 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8de1f285-47ac-42aa-8026-8addce656362" volumeName="kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079355 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ad0f0d72-0337-4347-bb50-e299a175f3ca" volumeName="kubernetes.io/secret/ad0f0d72-0337-4347-bb50-e299a175f3ca-image-registry-operator-tls" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079367 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3488a7eb-5170-478c-9af7-490dbe0f514e" volumeName="kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079379 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f86e881-275c-4387-a23a-06c559c8f1e8" volumeName="kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-utilities" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079391 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="674041a2-e2b0-4286-88cc-f1b00571e3f3" volumeName="kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079403 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85365dec-af50-406c-b258-890e4f454c4a" volumeName="kubernetes.io/configmap/85365dec-af50-406c-b258-890e4f454c4a-cco-trusted-ca" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079414 28758 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af950a67-1557-4352-8100-27281bb8ecbe" volumeName="kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config" seLinuxMountContext=""
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079427 28758 reconstruct.go:97] "Volume reconstruction finished"
Feb 23 14:34:28.079773 master-0 kubenswrapper[28758]: I0223 14:34:28.079437 28758 reconciler.go:26] "Reconciler: start to sync state"
Feb 23 14:34:28.087608 master-0 kubenswrapper[28758]: I0223 14:34:28.084119 28758 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 23 14:34:28.087608 master-0 kubenswrapper[28758]: I0223 14:34:28.086335 28758 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 23 14:34:28.087608 master-0 kubenswrapper[28758]: I0223 14:34:28.086418 28758 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 23 14:34:28.087608 master-0 kubenswrapper[28758]: I0223 14:34:28.086439 28758 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 23 14:34:28.087608 master-0 kubenswrapper[28758]: E0223 14:34:28.086830 28758 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 23 14:34:28.099550 master-0 kubenswrapper[28758]: I0223 14:34:28.098979 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bcf775fc9-z5t5b_57b57915-64dd-42f5-b06f-bc4bcc06b667/cluster-node-tuning-operator/0.log"
Feb 23 14:34:28.099550 master-0 kubenswrapper[28758]: I0223 14:34:28.099041 28758 generic.go:334] "Generic (PLEG): container finished" podID="57b57915-64dd-42f5-b06f-bc4bcc06b667" containerID="2f96ee533f5d52939bd2d7faf41993b118d9a6bfbb0b89e7580d1b1a849ba083" exitCode=1
Feb 23 14:34:28.102005 master-0 kubenswrapper[28758]: I0223 14:34:28.101965 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/2.log"
Feb 23 14:34:28.102084 master-0 kubenswrapper[28758]: I0223 14:34:28.102014 28758 generic.go:334] "Generic (PLEG): container finished" podID="8de1f285-47ac-42aa-8026-8addce656362" containerID="a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd" exitCode=255
Feb 23 14:34:28.104116 master-0 kubenswrapper[28758]: I0223 14:34:28.104082 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-5bd7768f54-bgg88_d2aa0d48-7c8e-4ddb-84a3-b3c34414c061/cluster-olm-operator/1.log"
Feb 23 14:34:28.104924 master-0 kubenswrapper[28758]: I0223 14:34:28.104881 28758 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="525b335554d223a0f792c02a10050ad9f40b958440d7f69f8c4c394f4e398780" exitCode=255
Feb 23 14:34:28.104924 master-0 kubenswrapper[28758]: I0223 14:34:28.104916 28758 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="4bdbe696b77666c832d686aa40ee248bba34d80f9ddd9b86b73fd8952b7b6113" exitCode=0
Feb 23 14:34:28.104924 master-0 kubenswrapper[28758]: I0223 14:34:28.104924 28758 generic.go:334] "Generic (PLEG): container finished" podID="d2aa0d48-7c8e-4ddb-84a3-b3c34414c061" containerID="881e3b61730f49e9657641d193738c054ca1938ca39d0f830ceee7b02b6b1f78" exitCode=0
Feb 23 14:34:28.106737 master-0 kubenswrapper[28758]: I0223 14:34:28.106699 28758 generic.go:334] "Generic (PLEG): container finished" podID="c84f66f0-207e-436a-8f4e-d1971fa815eb" containerID="ee32dcd7aedbc475bca90e7ca9047218b836991a39e06b3f0a1f7b5cd0f0132b" exitCode=0
Feb 23 14:34:28.106737 master-0 kubenswrapper[28758]: I0223 14:34:28.106724 28758 generic.go:334] "Generic (PLEG): container finished" podID="c84f66f0-207e-436a-8f4e-d1971fa815eb" containerID="d64c027236a9d4db40738067d4f95aca5d20f4b4daf356084c952897b507ab24" exitCode=0
Feb 23 14:34:28.135911 master-0 kubenswrapper[28758]: I0223 14:34:28.135770 28758 generic.go:334] "Generic (PLEG): container finished" podID="15245f43-22db-42eb-ab0b-702240986437" containerID="d8fda6fec7eadedba1d4400e4d7e27798506234350c769e6451d1eaf5b0ede8d" exitCode=0
Feb 23 14:34:28.139425 master-0 kubenswrapper[28758]: I0223 14:34:28.139389 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_56ff46cdb00d28519af7c0cdc9ea8d11/kube-scheduler-cert-syncer/0.log"
Feb 23 14:34:28.140023 master-0 kubenswrapper[28758]: I0223 14:34:28.139958 28758 generic.go:334] "Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="1bc827a8854dfec15010881aca2028b9a63aeaba5a66ba610581f32b2d5f3a53" exitCode=0
Feb 23 14:34:28.140023 master-0 kubenswrapper[28758]: I0223 14:34:28.140014 28758 generic.go:334] "Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="a83359f382bbf7e84f344fddfee0b01fcac40fd46b179877c37b9571576884e8" exitCode=2
Feb 23 14:34:28.140146 master-0 kubenswrapper[28758]: I0223 14:34:28.140025 28758 generic.go:334] "Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="a607c62f2f6fcfd8c1e82eea4a4f2c6c0363686e9d645511b15629d774c518ef" exitCode=0
Feb 23 14:34:28.140146 master-0 kubenswrapper[28758]: I0223 14:34:28.140034 28758 generic.go:334] "Generic (PLEG): container finished" podID="56ff46cdb00d28519af7c0cdc9ea8d11" containerID="7a566b5e0634944e8d5422e837762f49e59d372d80be5def3116f1b2efb53f3a" exitCode=0
Feb 23 14:34:28.147727 master-0 kubenswrapper[28758]: I0223 14:34:28.147651 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/4.log"
Feb 23 14:34:28.148050 master-0 kubenswrapper[28758]: I0223 14:34:28.148015 28758 generic.go:334] "Generic (PLEG): container finished" podID="3488a7eb-5170-478c-9af7-490dbe0f514e" containerID="b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f" exitCode=1
Feb 23 14:34:28.150990 master-0 kubenswrapper[28758]: I0223 14:34:28.150956 28758 generic.go:334] "Generic (PLEG): container finished" podID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerID="c172e0b4868c308f20f7ae8b13ba955f59eebc66ffba5fd517b3648866cbe26f" exitCode=0
Feb 23 14:34:28.157983 master-0 kubenswrapper[28758]: I0223 14:34:28.157925 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/2.log"
Feb 23 14:34:28.159011 master-0 kubenswrapper[28758]: I0223 14:34:28.158398 28758 generic.go:334] "Generic (PLEG): container finished" podID="12b256b7-a57b-4124-8452-25e74cfa7926" containerID="578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991" exitCode=1
Feb 23 14:34:28.162654 master-0 kubenswrapper[28758]: E0223 14:34:28.162604 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Feb 23 14:34:28.166306 master-0 kubenswrapper[28758]: I0223 14:34:28.166171 28758 generic.go:334] "Generic (PLEG): container finished" podID="687e92a6cecf1e2beeef16a0b322ad08" containerID="e3d987d25306f70a7327b5bce6ea549b476972db2d3366cf37d35b30c1531578" exitCode=0
Feb 23 14:34:28.166306 master-0 kubenswrapper[28758]: I0223 14:34:28.166206 28758 generic.go:334] "Generic (PLEG): container finished" podID="687e92a6cecf1e2beeef16a0b322ad08" containerID="bdd3290dcf6f732f006b381bec2edfc3a7a58623787040a36811efd529225351" exitCode=0
Feb 23 14:34:28.166306 master-0 kubenswrapper[28758]: I0223 14:34:28.166216 28758 generic.go:334] "Generic (PLEG): container finished" podID="687e92a6cecf1e2beeef16a0b322ad08" containerID="40ca3552a0c110bf631be979ddbff1eb4abba63ee7c1c34c419314566066d566" exitCode=0
Feb 23 14:34:28.168662 master-0 kubenswrapper[28758]: I0223 14:34:28.168628 28758 generic.go:334] "Generic (PLEG): container finished" podID="709ac071-4392-4a3f-a3d1-4bc8ba2f6236" containerID="c28d30a2b760e3ebbe98681a086eea9adf4942f9ca5f692597b7830f1309f2a8" exitCode=0
Feb 23 14:34:28.171588 master-0 kubenswrapper[28758]: I0223 14:34:28.171460 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-fc889cfd5-tw2r9_865ceedb-b19a-4f2f-b295-311e1b7a645e/kube-storage-version-migrator-operator/2.log"
Feb 23 14:34:28.171588 master-0 kubenswrapper[28758]: I0223 14:34:28.171565 28758 generic.go:334] "Generic (PLEG): container finished" podID="865ceedb-b19a-4f2f-b295-311e1b7a645e" containerID="deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856" exitCode=255
Feb 23 14:34:28.176174 master-0 kubenswrapper[28758]: I0223 14:34:28.176129 28758 generic.go:334] "Generic (PLEG): container finished" podID="f10f592e-5738-4879-b776-246b357d4621" containerID="d631da69f8bc3fb53c35b8ef8cedda80eee352d8a4bf7c9c1590bb5315fa046f" exitCode=0
Feb 23 14:34:28.182449 master-0 kubenswrapper[28758]: I0223 14:34:28.182397 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c75833224b4ba3fa488b77d8f5032" containerID="40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211" exitCode=0
Feb 23 14:34:28.186931 master-0 kubenswrapper[28758]: E0223 14:34:28.186895 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Feb 23 14:34:28.190917 master-0 kubenswrapper[28758]: I0223 14:34:28.190874 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-686847ff5f-9q266_4373687a-61a0-434b-81f7-3fecaa1494ef/control-plane-machine-set-operator/0.log"
Feb 23 14:34:28.191000 master-0 kubenswrapper[28758]: I0223 14:34:28.190936 28758 generic.go:334] "Generic (PLEG): container finished" podID="4373687a-61a0-434b-81f7-3fecaa1494ef" containerID="9b45bf126e1d92621372b72946a5700b9c49834f8698b4a6266b185922dfcbee" exitCode=1
Feb 23 14:34:28.195336 master-0 kubenswrapper[28758]: I0223 14:34:28.195072 28758 generic.go:334] "Generic (PLEG): container finished" podID="85365dec-af50-406c-b258-890e4f454c4a" containerID="348647c8be47f1f0398a726d98ab4e65fbf23ef3ceae1691e078bd87dddb99c7" exitCode=0
Feb 23 14:34:28.199861 master-0 kubenswrapper[28758]: I0223 14:34:28.199829 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-77cd4d9559-qvq8x_b9cf1c39-24f0-420b-8020-089616d1cdf0/kube-scheduler-operator-container/2.log"
Feb 23 14:34:28.199958 master-0 kubenswrapper[28758]: I0223 14:34:28.199875 28758 generic.go:334] "Generic (PLEG): container finished" podID="b9cf1c39-24f0-420b-8020-089616d1cdf0" containerID="cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753" exitCode=255
Feb 23 14:34:28.205826 master-0 kubenswrapper[28758]: I0223 14:34:28.205790 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-8586dccc9b-tvnmq_24829faf-50e8-45bb-abb0-7cc5ccf81080/openshift-apiserver-operator/2.log"
Feb 23 14:34:28.206053 master-0 kubenswrapper[28758]: I0223 14:34:28.206021 28758 generic.go:334] "Generic (PLEG): container finished" podID="24829faf-50e8-45bb-abb0-7cc5ccf81080" containerID="50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc" exitCode=255
Feb 23 14:34:28.211619 master-0 kubenswrapper[28758]: I0223 14:34:28.211573 28758 generic.go:334] "Generic (PLEG): container finished" podID="e1148263-7b15-4c12-a217-8b030ecd9348" containerID="909663cdb0c0ac8db46b5e0989f1e87cc68ef03f3124e36ed314cba8e6058032" exitCode=0
Feb 23 14:34:28.214258 master-0 kubenswrapper[28758]: I0223 14:34:28.214026 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-c48c8bf7c-vtnsw_b714a9df-026e-423d-a980-2569f0d92e47/service-ca-operator/2.log"
Feb 23 14:34:28.214258 master-0 kubenswrapper[28758]: I0223 14:34:28.214057 28758 generic.go:334] "Generic (PLEG): container finished" podID="b714a9df-026e-423d-a980-2569f0d92e47" containerID="22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877" exitCode=255
Feb 23 14:34:28.217171 master-0 kubenswrapper[28758]: I0223 14:34:28.217130 28758 generic.go:334] "Generic (PLEG): container finished" podID="b9774f8c-0f29-46d8-be77-81bcf74d5994" containerID="94bfdbcfdcf4914977da334b3fd2fe80966ec6c36be33d3628e4eada6361765f" exitCode=0
Feb 23 14:34:28.221887 master-0 kubenswrapper[28758]: I0223 14:34:28.221587 28758 generic.go:334] "Generic (PLEG): container finished" podID="18a83278819db2092fa26d8274eb3f00" containerID="4850e29e1670d0434d8ca87c5950a0424937b61be4c5fb2ae511df8fe764c7a2" exitCode=0
Feb 23 14:34:28.221887 master-0 kubenswrapper[28758]: I0223 14:34:28.221621 28758 generic.go:334] "Generic (PLEG): container finished" podID="18a83278819db2092fa26d8274eb3f00" containerID="626890ddbc06982ad60de27c4c4ad3f994d6a386f27886fbc0cdba298ce4fc87" exitCode=0
Feb 23 14:34:28.221887 master-0 kubenswrapper[28758]: I0223 14:34:28.221631 28758 generic.go:334] "Generic (PLEG): container finished" podID="18a83278819db2092fa26d8274eb3f00" containerID="e1b9898b8e99e752199648be3eeb21746009166b99c8416be13f36fdd12cbcdd" exitCode=0
Feb 23 14:34:28.226151 master-0 kubenswrapper[28758]: I0223 14:34:28.226088 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/cluster-autoscaler-operator/0.log"
Feb 23 14:34:28.226639 master-0 kubenswrapper[28758]: I0223 14:34:28.226594 28758 generic.go:334] "Generic (PLEG): container finished" podID="3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3" containerID="08d3df84ad8de18eec9e6a636baf4cb95ff798ffedb9a2d917a6b77d6c934fb7" exitCode=255
Feb 23 14:34:28.228927 master-0 kubenswrapper[28758]: I0223 14:34:28.228885 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_c997c8e9d3be51d454d8e61e376bef08/kube-rbac-proxy-crio/2.log"
Feb 23 14:34:28.229296 master-0 kubenswrapper[28758]: I0223 14:34:28.229264 28758 generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19" exitCode=1
Feb 23 14:34:28.229296 master-0 kubenswrapper[28758]: I0223 14:34:28.229288 28758 generic.go:334] "Generic (PLEG): container finished" podID="c997c8e9d3be51d454d8e61e376bef08" containerID="02553ea2f34fd5b9d9104437dd7120800883c473073a6a74895604093906e009" exitCode=0
Feb 23 14:34:28.235983 master-0 kubenswrapper[28758]: I0223 14:34:28.235917 28758 generic.go:334] "Generic (PLEG): container finished" podID="76c67569-3a72-4de9-87cd-432a4607b15b" containerID="0ccc3d4cff85a134107f96cc12ad89d4b417a48e075d82e5410bc67ca88a884e" exitCode=0
Feb 23 14:34:28.239294 master-0 kubenswrapper[28758]: I0223 14:34:28.239244 28758 generic.go:334] "Generic (PLEG): container finished" podID="b090ed5a-984f-41dd-8cea-34a1ece1514f" containerID="be11245e52df36836387b793176a5296c3112993cdce052d05331b901d833321" exitCode=0
Feb 23 14:34:28.241758 master-0 kubenswrapper[28758]: I0223 14:34:28.241719 28758 generic.go:334] "Generic (PLEG): container finished" podID="1225c7e0-f2d1-4b39-979c-c77191862c81" containerID="77b1792020215e1792b7b140e0ac936225d54418ad659b82a3189d8865905a56" exitCode=0
Feb 23 14:34:28.253802 master-0 kubenswrapper[28758]: I0223 14:34:28.253738 28758 generic.go:334] "Generic (PLEG): container finished" podID="ad0f0d72-0337-4347-bb50-e299a175f3ca" containerID="c3f209a9ce16ae00e125bd88a555117337a8948041a4b5c781124f66c958f969" exitCode=0
Feb 23 14:34:28.257324 master-0 kubenswrapper[28758]: I0223 14:34:28.257169 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-84b8d9d697-2hr5s_66c72c71-f74a-43ab-bf0d-1f4c93623774/manager/0.log"
Feb 23 14:34:28.257600 master-0 kubenswrapper[28758]: I0223 14:34:28.257559 28758 generic.go:334] "Generic (PLEG): container finished" podID="66c72c71-f74a-43ab-bf0d-1f4c93623774" containerID="e192093c7698f9c13f14fd55a50b3b960cd4142b3b8cb914299c2709465ffc51" exitCode=1
Feb 23 14:34:28.260150 master-0 kubenswrapper[28758]: I0223 14:34:28.259994 28758 generic.go:334] "Generic (PLEG): container finished" podID="94e30288-572c-4c6f-a063-a30243db8fd8" containerID="610ae43e00e8e1b0ff3dba88a6993fdf43f969aae5bdeeca94356519cf7c2602" exitCode=1
Feb 23 14:34:28.263144 master-0 kubenswrapper[28758]: E0223 14:34:28.262864 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Feb 23 14:34:28.263404 master-0 kubenswrapper[28758]: I0223 14:34:28.263367 28758 generic.go:334] "Generic (PLEG): container finished" podID="15ad7f4e-44c6-4426-8b97-c47a47786544" containerID="1c473cd845ba12d4bf1927e76251ff9dc47cd40e997137152d0746ccd7834430" exitCode=0
Feb 23 14:34:28.269361 master-0 kubenswrapper[28758]: I0223 14:34:28.269312 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-584cc7bcb5-67ds6_cb6e88cd-98de-446a-92e8-f56a2f133703/openshift-controller-manager-operator/1.log"
Feb 23 14:34:28.269528 master-0 kubenswrapper[28758]: I0223 14:34:28.269376 28758 generic.go:334] "Generic (PLEG): container finished" podID="cb6e88cd-98de-446a-92e8-f56a2f133703" containerID="e062ef1f26297d24d2516be8292a3297ef7a87cfa574a75bfb2f2e2e904d65e1" exitCode=255
Feb 23 14:34:28.273199 master-0 kubenswrapper[28758]: I0223 14:34:28.273020 28758 generic.go:334] "Generic (PLEG): container finished" podID="3f86e881-275c-4387-a23a-06c559c8f1e8" containerID="dd1e3de2ec0845831f5cf402eb5eb3565db815b7cdf1b7b1df427c27e6e8d027" exitCode=0
Feb 23 14:34:28.273199 master-0 kubenswrapper[28758]: I0223 14:34:28.273046 28758 generic.go:334] "Generic (PLEG): container finished" podID="3f86e881-275c-4387-a23a-06c559c8f1e8" containerID="d34072e7de379cff9844dcf3892b8004b156d1d5b0fd3f937aa6aac0ab1f96bb" exitCode=0
Feb 23 14:34:28.275069 master-0 kubenswrapper[28758]: I0223 14:34:28.275040 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-9cc7d7bb-6zmk9_1c60ff3f-2bb1-422e-be27-5eca96d85fd2/manager/0.log"
Feb 23 14:34:28.275345 master-0 kubenswrapper[28758]: I0223 14:34:28.275075 28758 generic.go:334] "Generic (PLEG): container finished" podID="1c60ff3f-2bb1-422e-be27-5eca96d85fd2" containerID="0bda8d15a11221e7b98f49af56e0807945868c4a5e5d028da4a5c53d7f410c01" exitCode=1
Feb 23 14:34:28.278064 master-0 kubenswrapper[28758]: I0223 14:34:28.277952 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_493a9ed3-6d64-489a-a68c-235b69a58782/installer/0.log"
Feb 23 14:34:28.278064 master-0 kubenswrapper[28758]: I0223 14:34:28.277982 28758 generic.go:334] "Generic (PLEG): container finished" podID="493a9ed3-6d64-489a-a68c-235b69a58782" containerID="1df16973da8e7c98a51b37b7335c255585ebd5dc4bbbed0d842fe3c32df42186" exitCode=1
Feb 23 14:34:28.282718 master-0 kubenswrapper[28758]: I0223 14:34:28.282646 28758 generic.go:334] "Generic (PLEG): container finished" podID="0514f486-2562-473d-8b01-b69441b82367" containerID="ac1b1e24015c720352cbb49d46332282e9687278977a0db4df21fe4d03fe58bd" exitCode=0
Feb 23 14:34:28.286961 master-0 kubenswrapper[28758]: I0223 14:34:28.286882 28758 generic.go:334] "Generic (PLEG): container finished" podID="fbb66172-1ea9-4683-b88f-227c4fd94924" containerID="4bd96aadee1934ae65fac50d897e75007505a399d7d143ad871ced8edd81b895" exitCode=0
Feb 23 14:34:28.292755 master-0 kubenswrapper[28758]: I0223 14:34:28.292717 28758 generic.go:334] "Generic (PLEG): container finished" podID="af950a67-1557-4352-8100-27281bb8ecbe" containerID="3d6da8a2ab007c14781f7a758e38f4dc17838974a913b095ba4be079439082e2" exitCode=0
Feb 23 14:34:28.294003 master-0 kubenswrapper[28758]: I0223 14:34:28.293971 28758 generic.go:334] "Generic (PLEG): container finished" podID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerID="27840ca7db3cacb7b24041918e945eaa29f553e36d936e622a640f67b21753c5" exitCode=0
Feb 23 14:34:28.296014 master-0 kubenswrapper[28758]: I0223 14:34:28.295931 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-7bcfbc574b-zdntd_cf04aca0-8174-4134-835d-37adf6a3b5ca/kube-controller-manager-operator/3.log"
Feb 23 14:34:28.296014 master-0 kubenswrapper[28758]: I0223 14:34:28.295964 28758 generic.go:334] "Generic (PLEG): container finished" podID="cf04aca0-8174-4134-835d-37adf6a3b5ca" containerID="7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17" exitCode=255
Feb 23 14:34:28.297383 master-0 kubenswrapper[28758]: I0223 14:34:28.297341 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0fdb9885-7479-43b5-8613-b2857a798ade/installer/0.log"
Feb 23 14:34:28.297383 master-0 kubenswrapper[28758]: I0223 14:34:28.297370 28758 generic.go:334] "Generic (PLEG): container finished" podID="0fdb9885-7479-43b5-8613-b2857a798ade" containerID="0a7994e86e7ddf474fa9a6e9d028e17c8d71e5299119418e1b05d25a7b604984" exitCode=1
Feb 23 14:34:28.299240 master-0 kubenswrapper[28758]: I0223 14:34:28.299021 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8bb99f4f-msq8f_482284fd-6911-4ba6-8d57-7966cc51117a/route-controller-manager/1.log"
Feb 23 14:34:28.299240 master-0 kubenswrapper[28758]: I0223 14:34:28.299045 28758 generic.go:334] "Generic (PLEG): container finished" podID="482284fd-6911-4ba6-8d57-7966cc51117a" containerID="cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7" exitCode=255
Feb 23 14:34:28.300863 master-0 kubenswrapper[28758]: I0223 14:34:28.300833 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-5c75f78c8b-cj2l7_5b54fc16-d2f7-4b10-a611-5b411b389c5a/package-server-manager/0.log"
Feb 23 14:34:28.301234 master-0 kubenswrapper[28758]: I0223 14:34:28.301083 28758 generic.go:334] "Generic (PLEG): container finished" podID="5b54fc16-d2f7-4b10-a611-5b411b389c5a" containerID="3e43920c8c9e66c01584e52a234477388c129ea94fe151ecc6c23098a8981522" exitCode=1
Feb 23 14:34:28.302844 master-0 kubenswrapper[28758]: I0223 14:34:28.302812 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5bd7c86784-mlbx2_e2d00ece-7586-4346-adbb-eaae1aeda69e/authentication-operator/2.log"
Feb 23 14:34:28.302950 master-0 kubenswrapper[28758]: I0223 14:34:28.302871 28758 generic.go:334] "Generic (PLEG): container finished" podID="e2d00ece-7586-4346-adbb-eaae1aeda69e" containerID="0d84cce0e88dcc70d83b8dd67a4e91c62d1f30fed4495c32ca427288ab62004f" exitCode=255
Feb 23 14:34:28.304217 master-0 kubenswrapper[28758]: I0223 14:34:28.304183 28758 generic.go:334] "Generic (PLEG): container finished" podID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerID="f2ac2c56c1e34a13c986eed31e989cbdb13313c0a91d44d3e68704b2399e5a39" exitCode=0
Feb 23 14:34:28.305937 master-0 kubenswrapper[28758]: I0223 14:34:28.305883 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c2393-e914-4c10-a18f-b30fcf012d19" containerID="943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904" exitCode=0
Feb 23 14:34:28.307640 master-0 kubenswrapper[28758]: I0223 14:34:28.307621 28758 generic.go:334] "Generic (PLEG): container finished" podID="adbf8f71-f005-4e5b-9de1-e49559cf7386" containerID="82543a2650cb25bd6bfd3b4eeba404e288e77827c47ab01f21f6a40862867df7" exitCode=0
Feb 23 14:34:28.307640 master-0 kubenswrapper[28758]: I0223 14:34:28.307637 28758 generic.go:334] "Generic (PLEG): container finished" podID="adbf8f71-f005-4e5b-9de1-e49559cf7386"
containerID="ce4e5fa17dbd27ef1bc8352b48b01fe69433b598b77046f093daa7a26c560341" exitCode=0 Feb 23 14:34:28.313962 master-0 kubenswrapper[28758]: I0223 14:34:28.313928 28758 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="6c7ee6bebf88d829805371dc4fd4b58845a3f175897eb6486d1688a8a41b95ec" exitCode=0 Feb 23 14:34:28.313962 master-0 kubenswrapper[28758]: I0223 14:34:28.313945 28758 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="a6fb92a24f40b4f0a4db9442684eefd34b35d2511917f6d03fe2ac8345b66ead" exitCode=0 Feb 23 14:34:28.313962 master-0 kubenswrapper[28758]: I0223 14:34:28.313954 28758 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="e5ef5b210d67b35d196c3c58900eaedb9852f06e215b468c9e1c1dc53fce376f" exitCode=0 Feb 23 14:34:28.313962 master-0 kubenswrapper[28758]: I0223 14:34:28.313962 28758 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="29b61cbeccf4eaed8df82b56cbe6a444cd43fd7fd1043bff465ae48185e7e6a0" exitCode=0 Feb 23 14:34:28.314101 master-0 kubenswrapper[28758]: I0223 14:34:28.313969 28758 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="5fd96309ade76aec20ed37e459e178ae08d952af2aa513f3703806ca12a7c927" exitCode=0 Feb 23 14:34:28.314101 master-0 kubenswrapper[28758]: I0223 14:34:28.313977 28758 generic.go:334] "Generic (PLEG): container finished" podID="483786a0-0a29-44bf-bbd0-2f37e045aa2c" containerID="7e56c504fefbada4ed2745ea4973c98d064a08b56a86637d3809d7946280cc20" exitCode=0 Feb 23 14:34:28.315334 master-0 kubenswrapper[28758]: I0223 14:34:28.315303 28758 generic.go:334] "Generic (PLEG): container finished" podID="a4ae9292-71dc-4484-b277-43cb26c1e04d" containerID="fafa7b0f21c17417165ff9592e80bbb6992685b66472f608cb30827b7d663491" exitCode=0 Feb 23 
14:34:28.316740 master-0 kubenswrapper[28758]: I0223 14:34:28.316705 28758 generic.go:334] "Generic (PLEG): container finished" podID="585f74db-4593-426b-b0c7-ec8f64810549" containerID="3d191963e287b24eb8e359eae476b7710f1b01ed3998cce17300434d7f6e8d0b" exitCode=0 Feb 23 14:34:28.320832 master-0 kubenswrapper[28758]: I0223 14:34:28.320804 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5d87bf58c-nq2tz_961e4ecd-545b-4270-ae34-e733dec793b6/kube-apiserver-operator/2.log" Feb 23 14:34:28.320832 master-0 kubenswrapper[28758]: I0223 14:34:28.320831 28758 generic.go:334] "Generic (PLEG): container finished" podID="961e4ecd-545b-4270-ae34-e733dec793b6" containerID="41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1" exitCode=255 Feb 23 14:34:28.322925 master-0 kubenswrapper[28758]: I0223 14:34:28.322759 28758 generic.go:334] "Generic (PLEG): container finished" podID="588a804a-430a-47f4-aa97-c08e907239da" containerID="dd9d0218254ede188f4f7d6704abe8b5fb6e6589605eb6657b67d6dc33a0eb39" exitCode=0 Feb 23 14:34:28.324580 master-0 kubenswrapper[28758]: I0223 14:34:28.324559 28758 generic.go:334] "Generic (PLEG): container finished" podID="ea0b3538-9a7d-4995-b628-2d63f21d683c" containerID="ef55d8167c92b21a23135f3a8ced87d51d79df376d7aca850c7cba442f901e30" exitCode=0 Feb 23 14:34:28.325980 master-0 kubenswrapper[28758]: I0223 14:34:28.325948 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/3.log" Feb 23 14:34:28.325980 master-0 kubenswrapper[28758]: I0223 14:34:28.325971 28758 generic.go:334] "Generic (PLEG): container finished" podID="2e89a047-9ebc-459b-b7b3-e902c1fb0e17" containerID="3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635" exitCode=1 Feb 23 14:34:28.336154 master-0 kubenswrapper[28758]: I0223 14:34:28.327278 28758 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_29f7b30e-bf6a-4e54-b009-1b0fcd830035/installer/0.log" Feb 23 14:34:28.336154 master-0 kubenswrapper[28758]: I0223 14:34:28.327311 28758 generic.go:334] "Generic (PLEG): container finished" podID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerID="325ae25a8338b6a2543759476e50b822896d1071332fcb78a23d45a461fab54f" exitCode=1 Feb 23 14:34:28.363064 master-0 kubenswrapper[28758]: E0223 14:34:28.362979 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:28.390789 master-0 kubenswrapper[28758]: E0223 14:34:28.390755 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:28.420701 master-0 kubenswrapper[28758]: I0223 14:34:28.418967 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/4.log" Feb 23 14:34:28.424502 master-0 kubenswrapper[28758]: I0223 14:34:28.422632 28758 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b" exitCode=255 Feb 23 14:34:28.424502 master-0 kubenswrapper[28758]: I0223 14:34:28.422655 28758 generic.go:334] "Generic (PLEG): container finished" podID="92c63c95-e880-4f51-9858-7715343f7bd8" containerID="4c56fad74102d69cc7f4ad84a92d8641223fd6345d161d89086b8aeb7a8a3450" exitCode=0 Feb 23 14:34:28.451838 master-0 kubenswrapper[28758]: I0223 14:34:28.451788 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7d7db75979-x4qnw_674041a2-e2b0-4286-88cc-f1b00571e3f3/network-operator/2.log" Feb 23 14:34:28.452031 master-0 kubenswrapper[28758]: I0223 14:34:28.451841 28758 generic.go:334] 
"Generic (PLEG): container finished" podID="674041a2-e2b0-4286-88cc-f1b00571e3f3" containerID="4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3" exitCode=255 Feb 23 14:34:28.454126 master-0 kubenswrapper[28758]: I0223 14:34:28.454096 28758 generic.go:334] "Generic (PLEG): container finished" podID="efdde2df-cd07-4898-88f4-7ecde0e04d7a" containerID="6fcd03423b6cd43db7e56c4b05dae69802a9ed15c0ea5b94c83b762001bf26a8" exitCode=0 Feb 23 14:34:28.454126 master-0 kubenswrapper[28758]: I0223 14:34:28.454113 28758 generic.go:334] "Generic (PLEG): container finished" podID="efdde2df-cd07-4898-88f4-7ecde0e04d7a" containerID="9f45d1abf22f4312045c038d142f9bba7b80278a0653e6693862acdb73f898f7" exitCode=0 Feb 23 14:34:28.455724 master-0 kubenswrapper[28758]: I0223 14:34:28.455694 28758 generic.go:334] "Generic (PLEG): container finished" podID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerID="9e428f62a82052df41d6797fdf53021748cfc643092e404d44bde9e1092162d6" exitCode=0 Feb 23 14:34:28.457944 master-0 kubenswrapper[28758]: I0223 14:34:28.457915 28758 generic.go:334] "Generic (PLEG): container finished" podID="d03a1e6620a92c780b0a91c72a55bc8b" containerID="e3f7365c1acb54a72a772e182d383b3e70a626dbae5d085f9cd81b46982b0137" exitCode=0 Feb 23 14:34:28.460443 master-0 kubenswrapper[28758]: I0223 14:34:28.460406 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-td489_bbe678de-546d-49d0-8280-3f6d94fa5e4f/approver/0.log" Feb 23 14:34:28.461047 master-0 kubenswrapper[28758]: I0223 14:34:28.460996 28758 generic.go:334] "Generic (PLEG): container finished" podID="bbe678de-546d-49d0-8280-3f6d94fa5e4f" containerID="86a800fe59aed9a0c248de7a352a6c1ffaea2cbdde27bb246147baa866e1c79a" exitCode=1 Feb 23 14:34:28.464384 master-0 kubenswrapper[28758]: E0223 14:34:28.464328 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:28.565085 master-0 
kubenswrapper[28758]: E0223 14:34:28.564923 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:28.665435 master-0 kubenswrapper[28758]: E0223 14:34:28.665359 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:28.765899 master-0 kubenswrapper[28758]: E0223 14:34:28.765832 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:28.791066 master-0 kubenswrapper[28758]: E0223 14:34:28.790984 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:28.866634 master-0 kubenswrapper[28758]: E0223 14:34:28.866490 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:28.967183 master-0 kubenswrapper[28758]: E0223 14:34:28.967123 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.068013 master-0 kubenswrapper[28758]: E0223 14:34:29.067916 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.169021 master-0 kubenswrapper[28758]: E0223 14:34:29.168801 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.269609 master-0 kubenswrapper[28758]: E0223 14:34:29.269528 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.370200 master-0 kubenswrapper[28758]: E0223 14:34:29.370142 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.470596 master-0 kubenswrapper[28758]: E0223 14:34:29.470529 28758 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.571518 master-0 kubenswrapper[28758]: E0223 14:34:29.571432 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.591815 master-0 kubenswrapper[28758]: E0223 14:34:29.591709 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:29.672358 master-0 kubenswrapper[28758]: E0223 14:34:29.672277 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.774995 master-0 kubenswrapper[28758]: E0223 14:34:29.774370 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.875141 master-0 kubenswrapper[28758]: E0223 14:34:29.875063 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:29.975655 master-0 kubenswrapper[28758]: E0223 14:34:29.975252 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.077663 master-0 kubenswrapper[28758]: E0223 14:34:30.077547 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.177775 master-0 kubenswrapper[28758]: E0223 14:34:30.177714 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.278204 master-0 kubenswrapper[28758]: E0223 14:34:30.278153 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.381849 master-0 kubenswrapper[28758]: E0223 14:34:30.381551 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.482491 master-0 
kubenswrapper[28758]: E0223 14:34:30.482426 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.585553 master-0 kubenswrapper[28758]: E0223 14:34:30.583146 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.683572 master-0 kubenswrapper[28758]: E0223 14:34:30.683520 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.784004 master-0 kubenswrapper[28758]: E0223 14:34:30.783949 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.884389 master-0 kubenswrapper[28758]: E0223 14:34:30.884331 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:30.985172 master-0 kubenswrapper[28758]: E0223 14:34:30.985068 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.085774 master-0 kubenswrapper[28758]: E0223 14:34:31.085704 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.185879 master-0 kubenswrapper[28758]: E0223 14:34:31.185818 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.192069 master-0 kubenswrapper[28758]: E0223 14:34:31.192010 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:31.286501 master-0 kubenswrapper[28758]: E0223 14:34:31.286382 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.386884 master-0 kubenswrapper[28758]: E0223 14:34:31.386837 28758 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.487640 master-0 kubenswrapper[28758]: E0223 14:34:31.487554 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.588401 master-0 kubenswrapper[28758]: E0223 14:34:31.588269 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.688675 master-0 kubenswrapper[28758]: E0223 14:34:31.688608 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.789065 master-0 kubenswrapper[28758]: E0223 14:34:31.789002 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.889650 master-0 kubenswrapper[28758]: E0223 14:34:31.889528 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:31.990335 master-0 kubenswrapper[28758]: E0223 14:34:31.990272 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.091417 master-0 kubenswrapper[28758]: E0223 14:34:32.091356 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.191923 master-0 kubenswrapper[28758]: E0223 14:34:32.191869 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.292712 master-0 kubenswrapper[28758]: E0223 14:34:32.292628 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.393141 master-0 kubenswrapper[28758]: E0223 14:34:32.393081 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.493789 master-0 kubenswrapper[28758]: 
E0223 14:34:32.493634 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.594215 master-0 kubenswrapper[28758]: E0223 14:34:32.594137 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.694794 master-0 kubenswrapper[28758]: E0223 14:34:32.694731 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.795228 master-0 kubenswrapper[28758]: E0223 14:34:32.795171 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.895634 master-0 kubenswrapper[28758]: E0223 14:34:32.895573 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:32.996834 master-0 kubenswrapper[28758]: E0223 14:34:32.996760 28758 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Feb 23 14:34:33.056498 master-0 kubenswrapper[28758]: I0223 14:34:33.056298 28758 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 14:34:33.059387 master-0 kubenswrapper[28758]: I0223 14:34:33.059063 28758 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 14:34:33.066327 master-0 kubenswrapper[28758]: I0223 14:34:33.066281 28758 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 14:34:33.088925 master-0 kubenswrapper[28758]: I0223 14:34:33.088872 28758 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 14:34:33.093878 master-0 kubenswrapper[28758]: I0223 14:34:33.093840 28758 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 23 14:34:33.500867 master-0 
kubenswrapper[28758]: I0223 14:34:33.500811 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_959c75833224b4ba3fa488b77d8f5032/kube-apiserver-check-endpoints/0.log" Feb 23 14:34:33.503068 master-0 kubenswrapper[28758]: I0223 14:34:33.502256 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c75833224b4ba3fa488b77d8f5032" containerID="7b1c7cc7b2e3d4c1fdbe5b4592355d6abc03a37f10ae5ab746402745b7ae1aa2" exitCode=255 Feb 23 14:34:34.036616 master-0 kubenswrapper[28758]: I0223 14:34:34.036557 28758 apiserver.go:52] "Watching apiserver" Feb 23 14:34:34.067814 master-0 kubenswrapper[28758]: I0223 14:34:34.067749 28758 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 23 14:34:34.392629 master-0 kubenswrapper[28758]: E0223 14:34:34.392469 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:39.393692 master-0 kubenswrapper[28758]: E0223 14:34:39.393617 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:44.394842 master-0 kubenswrapper[28758]: E0223 14:34:44.394775 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:49.395803 master-0 kubenswrapper[28758]: E0223 14:34:49.395718 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:54.396499 master-0 kubenswrapper[28758]: E0223 14:34:54.396423 28758 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Feb 23 14:34:56.044947 master-0 kubenswrapper[28758]: E0223 14:34:56.044891 28758 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/system.slice\": failed to get container info for 
\"/system.slice\": unknown container \"/system.slice\"" containerName="/system.slice" Feb 23 14:34:56.044947 master-0 kubenswrapper[28758]: E0223 14:34:56.044912 28758 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/kubepods.slice\": failed to get container info for \"/kubepods.slice\": unknown container \"/kubepods.slice\"" containerName="/kubepods.slice" Feb 23 14:34:56.046559 master-0 kubenswrapper[28758]: E0223 14:34:56.046360 28758 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/system.slice\": failed to get container info for \"/system.slice\": unknown container \"/system.slice\"" containerName="/system.slice" Feb 23 14:34:56.047440 master-0 kubenswrapper[28758]: E0223 14:34:56.047330 28758 summary_sys_containers.go:89] "Failed to get system container stats" err="failed to get cgroup stats for \"/kubepods.slice\": failed to get container info for \"/kubepods.slice\": unknown container \"/kubepods.slice\"" containerName="/kubepods.slice" Feb 23 14:34:56.290721 master-0 kubenswrapper[28758]: I0223 14:34:56.290579 28758 manager.go:324] Recovery completed Feb 23 14:34:56.382737 master-0 kubenswrapper[28758]: I0223 14:34:56.382675 28758 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 23 14:34:56.382737 master-0 kubenswrapper[28758]: I0223 14:34:56.382708 28758 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 23 14:34:56.382737 master-0 kubenswrapper[28758]: I0223 14:34:56.382745 28758 state_mem.go:36] "Initialized new in-memory state store" Feb 23 14:34:56.383028 master-0 kubenswrapper[28758]: I0223 14:34:56.383001 28758 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 23 14:34:56.383064 master-0 kubenswrapper[28758]: I0223 14:34:56.383021 28758 state_mem.go:96] "Updated CPUSet assignments" assignments={} Feb 23 14:34:56.383064 master-0 kubenswrapper[28758]: I0223 14:34:56.383052 28758 state_checkpoint.go:136] 
"State checkpoint: restored state from checkpoint" Feb 23 14:34:56.383064 master-0 kubenswrapper[28758]: I0223 14:34:56.383059 28758 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Feb 23 14:34:56.383064 master-0 kubenswrapper[28758]: I0223 14:34:56.383065 28758 policy_none.go:49] "None policy: Start" Feb 23 14:34:56.386366 master-0 kubenswrapper[28758]: I0223 14:34:56.386323 28758 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 23 14:34:56.386436 master-0 kubenswrapper[28758]: I0223 14:34:56.386383 28758 state_mem.go:35] "Initializing new in-memory state store" Feb 23 14:34:56.386725 master-0 kubenswrapper[28758]: I0223 14:34:56.386699 28758 state_mem.go:75] "Updated machine memory state" Feb 23 14:34:56.386725 master-0 kubenswrapper[28758]: I0223 14:34:56.386718 28758 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Feb 23 14:34:56.400883 master-0 kubenswrapper[28758]: I0223 14:34:56.400835 28758 manager.go:334] "Starting Device Plugin manager" Feb 23 14:34:56.401076 master-0 kubenswrapper[28758]: I0223 14:34:56.400901 28758 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 23 14:34:56.401076 master-0 kubenswrapper[28758]: I0223 14:34:56.400916 28758 server.go:79] "Starting device plugin registration server" Feb 23 14:34:56.401336 master-0 kubenswrapper[28758]: I0223 14:34:56.401306 28758 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 23 14:34:56.401402 master-0 kubenswrapper[28758]: I0223 14:34:56.401323 28758 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 23 14:34:56.401674 master-0 kubenswrapper[28758]: I0223 14:34:56.401642 28758 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 23 14:34:56.401753 master-0 kubenswrapper[28758]: I0223 14:34:56.401742 28758 plugin_manager.go:116] "The 
desired_state_of_world populator (plugin watcher) starts" Feb 23 14:34:56.401753 master-0 kubenswrapper[28758]: I0223 14:34:56.401751 28758 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 23 14:34:56.501761 master-0 kubenswrapper[28758]: I0223 14:34:56.501716 28758 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 14:34:56.504445 master-0 kubenswrapper[28758]: I0223 14:34:56.504416 28758 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Feb 23 14:34:56.504657 master-0 kubenswrapper[28758]: I0223 14:34:56.504642 28758 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Feb 23 14:34:56.504756 master-0 kubenswrapper[28758]: I0223 14:34:56.504743 28758 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Feb 23 14:34:56.504937 master-0 kubenswrapper[28758]: I0223 14:34:56.504921 28758 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Feb 23 14:34:56.516343 master-0 kubenswrapper[28758]: I0223 14:34:56.516277 28758 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Feb 23 14:34:56.516565 master-0 kubenswrapper[28758]: I0223 14:34:56.516396 28758 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Feb 23 14:34:56.643798 master-0 kubenswrapper[28758]: I0223 14:34:56.643682 28758 generic.go:334] "Generic (PLEG): container finished" podID="06bde94a-3126-4d0f-baba-49dc5fbec61b" containerID="7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06" exitCode=0 Feb 23 14:34:59.397419 master-0 kubenswrapper[28758]: I0223 14:34:59.397288 28758 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0"] Feb 23 14:34:59.398106 master-0 kubenswrapper[28758]: I0223 14:34:59.398003 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x","openshift-kube-scheduler/installer-5-master-0","openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr","openshift-network-diagnostics/network-check-target-x9gxm","openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f","openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7","openshift-cluster-node-tuning-operator/tuned-wsx6c","openshift-dns/node-resolver-7b6jk","openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7","openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8","openshift-monitoring/node-exporter-ckhv6","openshift-kube-scheduler/installer-4-retry-1-master-0","openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j","openshift-multus/multus-vdzqk","openshift-multus/network-metrics-daemon-9dnsv","openshift-apiserver/apiserver-666b887977-f7h55","openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd","openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57","openshift-network-node-identity/network-node-identity-td489","openshift-machine-conf
ig-operator/kube-rbac-proxy-crio-master-0","openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp","openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x","openshift-cluster-version/cluster-version-operator-57476485-m58rm","openshift-ingress-operator/ingress-operator-6569778c84-hsl6c","openshift-ingress/router-default-7b65dc9fcb-w68qb","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp","openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp","openshift-etcd/installer-1-master-0","openshift-marketplace/marketplace-operator-6f5488b997-7b5sp","openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn","openshift-ovn-kubernetes/ovnkube-node-ftngv","openshift-service-ca/service-ca-576b4d78bd-lq6ct","openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4","openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz","openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz","openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6","openshift-machine-config-operator/machine-config-daemon-fhcgg","openshift-machine-config-operator/machine-config-server-qwsmk","openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz","openshift-network-operator/iptables-alerter-t5h8h","openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9","openshift-marketplace/redhat-operators-tl6dk","openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7","openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb","openshift-controller-manager/controller-manager-55d786cb4c-cqkbt","openshift-kube-controller-manager/installer-3-master-0","openshift-cluster-olm-oper
ator/cluster-olm-operator-5bd7768f54-bgg88","openshift-dns-operator/dns-operator-8c7d49845-5rk2g","openshift-insights/insights-operator-59b498fcfb-rz897","openshift-kube-controller-manager/installer-2-retry-1-master-0","openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6","openshift-etcd/etcd-master-0","openshift-kube-scheduler/installer-4-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-monitoring/kube-state-metrics-59584d565f-pdl4r","openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v","openshift-dns/dns-default-86l7f","openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9","openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh","openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq","openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj","openshift-kube-controller-manager/installer-2-master-0","openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl","openshift-marketplace/certified-operators-cdrlk","openshift-multus/multus-additional-cni-plugins-jdsv6","assisted-installer/assisted-installer-controller-r6z45","openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v","openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266","openshift-monitoring/prometheus-operator-754bc4d665-nl92v","openshift-ingress-canary/ingress-canary-nwdpd","openshift-kube-apiserver/installer-4-master-0","openshift-monitoring/metrics-server-f55d8f669-b2gf9","openshift-network-diagnostics/network-check-source-58fb6744f5-848dv","openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr","openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm","openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz","openshift-kube-apiserver/installer-1-master-0","openshift-marketplace/community-operators-fjpvt","openshift-marketplace/redhat-marketplace-pfb9h","openshift-networ
k-operator/network-operator-7d7db75979-x4qnw","openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm"] Feb 23 14:34:59.398294 master-0 kubenswrapper[28758]: I0223 14:34:59.398231 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-r6z45" Feb 23 14:34:59.411528 master-0 kubenswrapper[28758]: I0223 14:34:59.402757 28758 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="7957d771-baea-4150-9b67-556f890a05d5" Feb 23 14:34:59.411528 master-0 kubenswrapper[28758]: I0223 14:34:59.407864 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 23 14:34:59.411528 master-0 kubenswrapper[28758]: I0223 14:34:59.408135 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.411528 master-0 kubenswrapper[28758]: I0223 14:34:59.408332 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Feb 23 14:34:59.411528 master-0 kubenswrapper[28758]: I0223 14:34:59.408563 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Feb 23 14:34:59.411528 master-0 kubenswrapper[28758]: I0223 14:34:59.408859 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Feb 23 14:34:59.423401 master-0 kubenswrapper[28758]: I0223 14:34:59.422127 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 14:34:59.423401 master-0 kubenswrapper[28758]: I0223 14:34:59.422355 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 23 14:34:59.423401 master-0 kubenswrapper[28758]: I0223 14:34:59.422811 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 23 14:34:59.423401 master-0 kubenswrapper[28758]: I0223 14:34:59.423146 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.423401 master-0 kubenswrapper[28758]: I0223 14:34:59.423201 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 23 14:34:59.423401 master-0 kubenswrapper[28758]: I0223 14:34:59.423274 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.423401 master-0 kubenswrapper[28758]: I0223 14:34:59.423297 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.425346 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.425385 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 
14:34:59.425464 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.425543 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.425657 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.425383 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.425793 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.426014 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.426026 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 23 14:34:59.426318 master-0 kubenswrapper[28758]: I0223 14:34:59.426140 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 23 14:34:59.426710 master-0 kubenswrapper[28758]: I0223 14:34:59.426310 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 14:34:59.426710 master-0 kubenswrapper[28758]: I0223 14:34:59.426473 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Feb 23 14:34:59.427428 
master-0 kubenswrapper[28758]: I0223 14:34:59.426794 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 23 14:34:59.427428 master-0 kubenswrapper[28758]: I0223 14:34:59.426882 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 23 14:34:59.427428 master-0 kubenswrapper[28758]: I0223 14:34:59.426991 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 14:34:59.427428 master-0 kubenswrapper[28758]: I0223 14:34:59.427149 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 14:34:59.427428 master-0 kubenswrapper[28758]: I0223 14:34:59.427246 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 23 14:34:59.427654 master-0 kubenswrapper[28758]: I0223 14:34:59.427494 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 14:34:59.427822 master-0 kubenswrapper[28758]: I0223 14:34:59.427778 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 14:34:59.427882 master-0 kubenswrapper[28758]: I0223 14:34:59.427845 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.428244 master-0 kubenswrapper[28758]: I0223 14:34:59.428015 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.428244 master-0 kubenswrapper[28758]: I0223 14:34:59.428055 28758 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 23 14:34:59.428244 master-0 kubenswrapper[28758]: I0223 14:34:59.428133 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 23 14:34:59.428494 master-0 kubenswrapper[28758]: I0223 14:34:59.428456 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.428718 master-0 kubenswrapper[28758]: I0223 14:34:59.428704 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 14:34:59.428824 master-0 kubenswrapper[28758]: I0223 14:34:59.428454 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 14:34:59.428963 master-0 kubenswrapper[28758]: I0223 14:34:59.428930 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 23 14:34:59.429006 master-0 kubenswrapper[28758]: I0223 14:34:59.428982 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 14:34:59.429166 master-0 kubenswrapper[28758]: I0223 14:34:59.429119 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:34:59.431233 master-0 kubenswrapper[28758]: I0223 14:34:59.431170 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.431388 master-0 kubenswrapper[28758]: I0223 14:34:59.431288 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 14:34:59.431388 master-0 kubenswrapper[28758]: I0223 14:34:59.431310 28758 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 14:34:59.431388 master-0 kubenswrapper[28758]: I0223 14:34:59.431346 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Feb 23 14:34:59.431638 master-0 kubenswrapper[28758]: I0223 14:34:59.431546 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 14:34:59.431638 master-0 kubenswrapper[28758]: I0223 14:34:59.431555 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 23 14:34:59.431638 master-0 kubenswrapper[28758]: I0223 14:34:59.431615 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.431638 master-0 kubenswrapper[28758]: I0223 14:34:59.431630 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Feb 23 14:34:59.432004 master-0 kubenswrapper[28758]: I0223 14:34:59.431550 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Feb 23 14:34:59.432004 master-0 kubenswrapper[28758]: I0223 14:34:59.431743 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 14:34:59.432004 master-0 kubenswrapper[28758]: I0223 14:34:59.431884 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 14:34:59.432004 master-0 kubenswrapper[28758]: I0223 14:34:59.431952 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 23 14:34:59.432004 master-0 kubenswrapper[28758]: I0223 14:34:59.431991 28758 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 14:34:59.432336 master-0 kubenswrapper[28758]: I0223 14:34:59.432118 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 14:34:59.432336 master-0 kubenswrapper[28758]: I0223 14:34:59.432172 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 23 14:34:59.432336 master-0 kubenswrapper[28758]: I0223 14:34:59.431892 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 23 14:34:59.432336 master-0 kubenswrapper[28758]: I0223 14:34:59.432285 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 23 14:34:59.432336 master-0 kubenswrapper[28758]: I0223 14:34:59.432335 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 14:34:59.432879 master-0 kubenswrapper[28758]: I0223 14:34:59.432470 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 23 14:34:59.434022 master-0 kubenswrapper[28758]: I0223 14:34:59.432555 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 23 14:34:59.434022 master-0 kubenswrapper[28758]: I0223 14:34:59.432642 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 23 14:34:59.434022 master-0 kubenswrapper[28758]: I0223 14:34:59.432687 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 23 14:34:59.434022 master-0 kubenswrapper[28758]: I0223 14:34:59.432744 28758 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 14:34:59.434022 master-0 kubenswrapper[28758]: I0223 14:34:59.432839 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 14:34:59.434022 master-0 kubenswrapper[28758]: I0223 14:34:59.433882 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Feb 23 14:34:59.434245 master-0 kubenswrapper[28758]: I0223 14:34:59.434060 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Feb 23 14:34:59.434771 master-0 kubenswrapper[28758]: I0223 14:34:59.432863 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Feb 23 14:34:59.434771 master-0 kubenswrapper[28758]: I0223 14:34:59.434427 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 14:34:59.434771 master-0 kubenswrapper[28758]: I0223 14:34:59.432933 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 23 14:34:59.434771 master-0 kubenswrapper[28758]: I0223 14:34:59.432946 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 14:34:59.434771 master-0 kubenswrapper[28758]: I0223 14:34:59.434585 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 23 14:34:59.434771 master-0 kubenswrapper[28758]: I0223 14:34:59.434639 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 23 14:34:59.434771 
master-0 kubenswrapper[28758]: I0223 14:34:59.433010 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 14:34:59.434771 master-0 kubenswrapper[28758]: I0223 14:34:59.433444 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 23 14:34:59.438025 master-0 kubenswrapper[28758]: I0223 14:34:59.437973 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 14:34:59.438856 master-0 kubenswrapper[28758]: I0223 14:34:59.438198 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Feb 23 14:34:59.438856 master-0 kubenswrapper[28758]: I0223 14:34:59.438221 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 23 14:34:59.438856 master-0 kubenswrapper[28758]: I0223 14:34:59.438740 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 14:34:59.438856 master-0 kubenswrapper[28758]: I0223 14:34:59.438749 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 14:34:59.439669 master-0 kubenswrapper[28758]: I0223 14:34:59.439061 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 23 14:34:59.439669 master-0 kubenswrapper[28758]: I0223 14:34:59.439183 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 14:34:59.439669 master-0 kubenswrapper[28758]: I0223 14:34:59.439213 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 23 14:34:59.439669 master-0 
kubenswrapper[28758]: I0223 14:34:59.439340 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Feb 23 14:34:59.439669 master-0 kubenswrapper[28758]: I0223 14:34:59.439361 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 14:34:59.439669 master-0 kubenswrapper[28758]: I0223 14:34:59.439399 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:34:59.440582 master-0 kubenswrapper[28758]: I0223 14:34:59.439413 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 14:34:59.440582 master-0 kubenswrapper[28758]: E0223 14:34:59.440004 28758 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:34:59.440582 master-0 kubenswrapper[28758]: I0223 14:34:59.439492 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 14:34:59.440736 master-0 kubenswrapper[28758]: E0223 14:34:59.440116 28758 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.440821 master-0 kubenswrapper[28758]: E0223 14:34:59.440096 28758 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Feb 23 14:34:59.440821 master-0 kubenswrapper[28758]: I0223 14:34:59.439564 28758 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 14:34:59.440921 master-0 kubenswrapper[28758]: I0223 14:34:59.439636 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Feb 23 14:34:59.440921 master-0 kubenswrapper[28758]: I0223 14:34:59.440158 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 23 14:34:59.441003 master-0 kubenswrapper[28758]: I0223 14:34:59.440329 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 23 14:34:59.441003 master-0 kubenswrapper[28758]: I0223 14:34:59.440632 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 14:34:59.442655 master-0 kubenswrapper[28758]: I0223 14:34:59.442383 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 14:34:59.444815 master-0 kubenswrapper[28758]: I0223 14:34:59.444761 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" Feb 23 14:34:59.447838 master-0 kubenswrapper[28758]: I0223 14:34:59.447805 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450018 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450104 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450209 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" event={"ID":"57b57915-64dd-42f5-b06f-bc4bcc06b667","Type":"ContainerStarted","Data":"18bf2f609f8efa099778779b29a09c8f72903a95132d89474102c7f4d79d3d39"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450342 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerStarted","Data":"7241c912fdeaeebbd740c87e12cc1e6c5e87f04c4117739e58873e1ff2b89ecc"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450361 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerDied","Data":"a3d7f9dd773bb2be7eef32103651b05954025b8d3ad91ea82c3e56fc88bd34fd"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450374 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" event={"ID":"8de1f285-47ac-42aa-8026-8addce656362","Type":"ContainerStarted","Data":"bf4e70417a5730a71d5c5227bc5ca324709a18d10a43505d1415f3e27a32b0fc"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450395 28758 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerStarted","Data":"1e39418594b660aa7d2fd220cbe33ba64607470ce93c748d1b847909be0ae5de"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450437 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerDied","Data":"525b335554d223a0f792c02a10050ad9f40b958440d7f69f8c4c394f4e398780"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450470 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerDied","Data":"4bdbe696b77666c832d686aa40ee248bba34d80f9ddd9b86b73fd8952b7b6113"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450536 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerDied","Data":"881e3b61730f49e9657641d193738c054ca1938ca39d0f830ceee7b02b6b1f78"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450554 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" event={"ID":"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061","Type":"ContainerStarted","Data":"4b7d2c8100142f929dc133ef3a280566ae721c684f05389d72d0b6d99271f228"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450590 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" 
event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerStarted","Data":"7684169f87c9b5161cf129865dea04bd28679344f72d4a78a47e2ebf7f12ba2b"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450609 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerDied","Data":"ee32dcd7aedbc475bca90e7ca9047218b836991a39e06b3f0a1f7b5cd0f0132b"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450623 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerDied","Data":"d64c027236a9d4db40738067d4f95aca5d20f4b4daf356084c952897b507ab24"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450636 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tl6dk" event={"ID":"c84f66f0-207e-436a-8f4e-d1971fa815eb","Type":"ContainerStarted","Data":"223e6055ac3ccbf4f5095a5d877f5de6b8592fbf41a24e3985f9a14f56619a70"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450671 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"afeec80f2ec1ff5cb32c2367912befef","Type":"ContainerStarted","Data":"d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450688 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"afeec80f2ec1ff5cb32c2367912befef","Type":"ContainerStarted","Data":"1318491cbf4a9852637e1a59a321f0824086291a3e80867692cb4a5b349fa4cf"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450703 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"15245f43-22db-42eb-ab0b-702240986437","Type":"ContainerDied","Data":"d8fda6fec7eadedba1d4400e4d7e27798506234350c769e6451d1eaf5b0ede8d"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450719 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"15245f43-22db-42eb-ab0b-702240986437","Type":"ContainerDied","Data":"926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450756 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="926e5a9c249a2c0844ff9ae9838c305e194c581c93350a2a61909bb76a1d1f42" Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450782 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edb6ec0bf3017407524bcef4889f83a1c2c54ff05b0e40918d7eda5368e06757" Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450792 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"3f0687c145632743367603550edfb620cb9232947c49e722ee283fed3a1505a2"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450806 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerDied","Data":"b59ddaa1f996d8d231b18a402187cbb1ee1446439ec71026f52221d4aaab529f"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450848 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" 
event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"67af818672ec3e17ceefc5db7bd1c60de0a5faf480f82c76fab1be5ba6eb05bb"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450862 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" event={"ID":"3488a7eb-5170-478c-9af7-490dbe0f514e","Type":"ContainerStarted","Data":"867a47d4a06c655239027935fd0111c0fd83a1e1a4a4c825f97faccc95bc37fc"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450873 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"5f67ab24-82bc-4e71-b974-e25b819986c8","Type":"ContainerDied","Data":"c172e0b4868c308f20f7ae8b13ba955f59eebc66ffba5fd517b3648866cbe26f"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450922 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"5f67ab24-82bc-4e71-b974-e25b819986c8","Type":"ContainerDied","Data":"6dfbc560ad1e1a5e7a72cca845fe403480de4a21ae08827666c9a7f55f0e049e"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450942 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dfbc560ad1e1a5e7a72cca845fe403480de4a21ae08827666c9a7f55f0e049e" Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.450953 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" event={"ID":"5cc28e06-3542-4a25-a8b1-5f5b4ee41114","Type":"ContainerStarted","Data":"f6ede5e39b7b33efaf1190fcbfc978e3b3ab212001c387b4a6ad421efbb824cf"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451026 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" 
event={"ID":"5cc28e06-3542-4a25-a8b1-5f5b4ee41114","Type":"ContainerStarted","Data":"dab7706dbc7b87f5dfe86583c49c1b5888ad124b3a580895f9a21bab5c02cb05"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451047 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" event={"ID":"5cc28e06-3542-4a25-a8b1-5f5b4ee41114","Type":"ContainerStarted","Data":"e8da33da933e20232c5b6f5c3675ee250e9d8a32fcabaead70736dd1e091c691"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451059 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"123c9da1a73f2000f089739afcd687dfaec03bdf3090d2bb31eea1f983917dfc"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451114 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerDied","Data":"578a9e2a674702d2219386592f2e2254d406630d2cc2c55e8edf24f8f9368991"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451134 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"ea6ff3745fdddde6e725677d9a1d30748c6e897ee50a3c7ab0203d9af3e9590f"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451155 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" event={"ID":"12b256b7-a57b-4124-8452-25e74cfa7926","Type":"ContainerStarted","Data":"14329fd568a14f04f43b97498ab954734f3a702059891d8fa97640d70060f640"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 
14:34:59.451202 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" event={"ID":"8ca3dee6-f651-4536-991c-303752c22f07","Type":"ContainerStarted","Data":"0c89c563f0edf976572aa08fd148607bd8a41b8057fc467b03154cfc52d5048c"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451220 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" event={"ID":"8ca3dee6-f651-4536-991c-303752c22f07","Type":"ContainerStarted","Data":"656ff9c0d892162aa14c1f4924bdaef93d7972bd31e96f6230a2df8c99d0a8a8"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451233 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" event={"ID":"8ca3dee6-f651-4536-991c-303752c22f07","Type":"ContainerStarted","Data":"71856c04f28ed0a4e9a36c70729f2e0d164816c342db7fab0a6d5f76b0f61b6a"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451286 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0829d0cc308970ce0a149e45fa21b4352374f015f4b31f7eb48a14c16cbea5b2" Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451300 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" event={"ID":"709ac071-4392-4a3f-a3d1-4bc8ba2f6236","Type":"ContainerStarted","Data":"856c1fc3c90a7b6e24237be79fd4d77aac6d053d0596ff94d90d617594f6c02a"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451315 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" event={"ID":"709ac071-4392-4a3f-a3d1-4bc8ba2f6236","Type":"ContainerDied","Data":"c28d30a2b760e3ebbe98681a086eea9adf4942f9ca5f692597b7830f1309f2a8"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451365 28758 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" event={"ID":"709ac071-4392-4a3f-a3d1-4bc8ba2f6236","Type":"ContainerStarted","Data":"4257fe78462bb2b5b2d39786788f5521c3464b4b4bf8cf481be2dae32881a79a"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451395 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe39d10604843d18955327bfbe5d658cdd687bd280154adde8e869872e1753cf" Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451405 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerStarted","Data":"71999ac05242974407057ccc6a31a0e5b271cd2c31f7bd1c3957d9ede98478ca"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451457 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerDied","Data":"deffb87f96ddeeef2ceba573c92018620cd6c1adba32e1a82ff2a0041c126856"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451517 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" event={"ID":"865ceedb-b19a-4f2f-b295-311e1b7a645e","Type":"ContainerStarted","Data":"9c6c5f4b9ba45ac61b51f9857ceb74fc6b905bb2bdd1312940fdeb330ace9d7f"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451540 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" 
event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"f429434a74a7d0af0afa5ab1f60331480efe79d0b9a3742ea96282333066d9bc"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451555 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"c159bc6185d4445126a130871d407c6543008d04b81d0dc8a0df37b2a655095d"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451567 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"3ea5faed78b2ea2b8994832461aaaf74c9ef592eac3d0c97920f60f89ceac985"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451605 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"949866c09669bb2b0e28930526f2d54501e60f4ade0138a026b409a4b51634b6"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451618 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"141510cd57858396dc1d9695963a56975d67176a1a34b85a413db22e7a9d1c2d"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451630 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"feadf5cd210bd27ef055d343cee55ad02012eb9f1c5d4c7dac28bf62c520d26a"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451641 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"4abe3d81e03117e533870906b84ab247755d65cb9c415d60fd56970186891b64"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451683 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"c7ce566a8c6dfc791fa99412db4be443eb6afb3b0b0f440d5d6ea4d4f2a79037"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451699 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerDied","Data":"d631da69f8bc3fb53c35b8ef8cedda80eee352d8a4bf7c9c1590bb5315fa046f"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451714 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" event={"ID":"f10f592e-5738-4879-b776-246b357d4621","Type":"ContainerStarted","Data":"984304e1b4252b7619a58df9f7ce55ca2014852517f80186c3411dc4b687d274"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451726 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" event={"ID":"255b5a89-1b89-42dc-868a-32ce67975a54","Type":"ContainerStarted","Data":"3b1c0720f422e4cf08280fe87e035038c55aa20e5d2dbdf7c6fac26671b08d29"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451738 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" event={"ID":"255b5a89-1b89-42dc-868a-32ce67975a54","Type":"ContainerStarted","Data":"a5727697c2b4cf38a7045ae8edfe9cf2a413bc9b589c95f8630aa7de7ef3ba40"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451784 
28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"7b1c7cc7b2e3d4c1fdbe5b4592355d6abc03a37f10ae5ab746402745b7ae1aa2"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451797 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451811 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451855 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451869 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451880 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerDied","Data":"40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 
14:34:59.451893 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"77590dc8fba389fa97fd0b176ea4707c8bfaef0fd399e113347fbdf7415d0d0f"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451905 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266" event={"ID":"4373687a-61a0-434b-81f7-3fecaa1494ef","Type":"ContainerStarted","Data":"61f0e42918568e19cfd95e1214cb522d9ea19f33de09143d470f6bb7988c8d8a"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451958 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266" event={"ID":"4373687a-61a0-434b-81f7-3fecaa1494ef","Type":"ContainerDied","Data":"9b45bf126e1d92621372b72946a5700b9c49834f8698b4a6266b185922dfcbee"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.451985 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266" event={"ID":"4373687a-61a0-434b-81f7-3fecaa1494ef","Type":"ContainerStarted","Data":"4ae19cabdf4e15b9983be578ad7a63be61278ebfe49db1eb9827bad0d8d1a242"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.452041 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerStarted","Data":"87ac5de4a86de4ff5bcd686d0a9509d8482fb517bcd1929c37584398f44deed7"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.452057 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" 
event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerDied","Data":"348647c8be47f1f0398a726d98ab4e65fbf23ef3ceae1691e078bd87dddb99c7"} Feb 23 14:34:59.451972 master-0 kubenswrapper[28758]: I0223 14:34:59.452071 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerStarted","Data":"4f625178953c567a0bf9ef3fc88c664d915e017cda848446b8ac0cad04aeff48"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452109 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" event={"ID":"85365dec-af50-406c-b258-890e4f454c4a","Type":"ContainerStarted","Data":"b4132c8230caf30ada71198ad6ab1bfac93f4aab775d0d4c1263153a8363aaf9"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452125 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerStarted","Data":"de147076f9d00421d454d2b4d2c157f8440c0694fa5a6d3ad620d5c550d4adbe"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452138 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerDied","Data":"cb07ee7a08ec58d0214f496b0ca32c3611b77165c521b9fecab35b067ef91753"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452152 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" 
event={"ID":"b9cf1c39-24f0-420b-8020-089616d1cdf0","Type":"ContainerStarted","Data":"4782d187d8efc0b4014aa50653963e17b661187c5f36601036516cb2857a5d98"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452191 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerStarted","Data":"0772c3729738659c84accce490c8a697b2dc528f52ebd605f9f2d93ea9004a9e"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452216 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerDied","Data":"50817d53493752eda9d4463a0b5a65e93107befbd5b1e52f265dd7c7f17a73bc"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452244 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" event={"ID":"24829faf-50e8-45bb-abb0-7cc5ccf81080","Type":"ContainerStarted","Data":"fcec922662159dc1cf38c675599685e8c305a9fc3cb374ca7d731b92354b4d60"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452300 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" event={"ID":"9b558268-2262-4593-893e-408639a9987d","Type":"ContainerStarted","Data":"deea197b57c536e08f67845d8b2b560fc3c42e69e2de93f0b375ab236c9ca72e"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452313 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" event={"ID":"9b558268-2262-4593-893e-408639a9987d","Type":"ContainerStarted","Data":"16f09e4885901a350ad5d473f91c2d104b970ac63a52a074d9fd82367db4b586"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 
14:34:59.452325 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"e1148263-7b15-4c12-a217-8b030ecd9348","Type":"ContainerDied","Data":"909663cdb0c0ac8db46b5e0989f1e87cc68ef03f3124e36ed314cba8e6058032"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452374 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"e1148263-7b15-4c12-a217-8b030ecd9348","Type":"ContainerDied","Data":"329af3342eefe9c97b6c099fc381b37d3806777c128f648a5ea19e45c1f91e62"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452388 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="329af3342eefe9c97b6c099fc381b37d3806777c128f648a5ea19e45c1f91e62" Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452400 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerStarted","Data":"78089bf3eecbd92606ced5d402993a51e0b2a1c97d9b614aa158b112de869851"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452416 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerDied","Data":"22cda996f9dec95459a017791c6284a80f33c42296156317930bcb92d3fc7877"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452431 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" event={"ID":"b714a9df-026e-423d-a980-2569f0d92e47","Type":"ContainerStarted","Data":"5110b129f87dd0c4cfa0060a0c853f8887b553680a908511fa6dc6b38b84e26d"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452443 28758 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" event={"ID":"b9774f8c-0f29-46d8-be77-81bcf74d5994","Type":"ContainerStarted","Data":"076f7d65fee110708a8e6296446403bfd54323b5e373a5fdc91a115dfcb1e945"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452455 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" event={"ID":"b9774f8c-0f29-46d8-be77-81bcf74d5994","Type":"ContainerDied","Data":"94bfdbcfdcf4914977da334b3fd2fe80966ec6c36be33d3628e4eada6361765f"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452469 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" event={"ID":"b9774f8c-0f29-46d8-be77-81bcf74d5994","Type":"ContainerStarted","Data":"a74df791e2285ece031ddb2cb6a548b32c5f641cf114501941f1933c7809fad4"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452506 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"8ebd95875caac8439baca54415c00dcf7fafd7bb372421de30584dee13828051"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452530 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"d783393f51b6dc83f57672bdad13e558c8969087e1d7a88a2a8c67c244b55dbe"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452551 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"ecda0f59f77c55aef5b6997149fe196458b5009b38a90f2c2ecb3d3be2666b23"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 
14:34:59.452573 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"6a2e68abb8955d199221c9563462d7c40348ae0f0d637124ee947f300645aca2"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452593 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"cce51c3475ff5b0dfbfd6ba41c684f036ddeb5d98e0d3c9ece1c70d0cc4cd606"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452612 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerDied","Data":"4850e29e1670d0434d8ca87c5950a0424937b61be4c5fb2ae511df8fe764c7a2"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452624 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerDied","Data":"626890ddbc06982ad60de27c4c4ad3f994d6a386f27886fbc0cdba298ce4fc87"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452637 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerDied","Data":"e1b9898b8e99e752199648be3eeb21746009166b99c8416be13f36fdd12cbcdd"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452647 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"18a83278819db2092fa26d8274eb3f00","Type":"ContainerStarted","Data":"94e94715e4a9a7ea0bdeab74580c1cabb71e05248b0269144d5616aa9022f9eb"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452657 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"027076b56ad4f46e2ddfd17e8442ed8e0b84dc543030de168a735a3c04b63c22"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452671 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"e10405a7c233b1e760533a57a424faa717efb1d70d9ccd63d71a225ba97dfe41"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452683 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"2d6756d14611d57202a60d04d7be8115390739a88f5d00004fc09162e73f45a9"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452717 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" event={"ID":"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c","Type":"ContainerStarted","Data":"f5da67148a68052c542cf85f2d066448d7345b96cbb5647569d62bb97b2af2b1"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452735 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerStarted","Data":"dfdd31784cff3d18e2ca10b9d658f04a933e734fce723e106164c1f2dd94e36b"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452748 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerDied","Data":"08d3df84ad8de18eec9e6a636baf4cb95ff798ffedb9a2d917a6b77d6c934fb7"} Feb 23 14:34:59.455160 master-0 
kubenswrapper[28758]: I0223 14:34:59.452759 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerStarted","Data":"20ad7c69969be4fe0faec7c107f4d93134a311b1be86286c2c8404730a5b81aa"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452776 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" event={"ID":"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3","Type":"ContainerStarted","Data":"2a52d8e1940b8a601f24fbfede361672aeb32a5195856b935f21043b52b85ae5"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452796 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerStarted","Data":"404214ba54b8b128195146c77065f2702359c7ee02579e2cb25064ddce3c7dcc"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452809 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"6ac0b900bdb2d552799e0b2929f88eaa7518eb0c998cb215c17a947032781e19"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452831 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerDied","Data":"02553ea2f34fd5b9d9104437dd7120800883c473073a6a74895604093906e009"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452848 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"c997c8e9d3be51d454d8e61e376bef08","Type":"ContainerStarted","Data":"57996809f1e2dec5f618cc991b1ec9797922b627eb03d04dabd6bb6cb4205117"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452878 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" event={"ID":"8dd5fa7c-0519-4170-89c6-b369e5fc1990","Type":"ContainerStarted","Data":"37a983f29806128b58a67941f38167320dfc1a3122dd4612f255a73dd51901f6"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452890 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" event={"ID":"8dd5fa7c-0519-4170-89c6-b369e5fc1990","Type":"ContainerStarted","Data":"0ed4966819a9ced864bc862dafc478844197956409ab608658cff2b481eaf91a"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452901 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" event={"ID":"8dd5fa7c-0519-4170-89c6-b369e5fc1990","Type":"ContainerStarted","Data":"8a899f46b5ae367f29ecac877a3d8b6b2ea9e0cf04f3dc088df5a7ab7fffcc36"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452912 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"fdc652e9656c00d3c073d43834d94e829b13a496c3f4293067e250e5e619251d"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452924 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"c9d52dd8df6e377a4d3cb203e2bf79a0b610e87fcaa198b6bee208b63f11b129"} Feb 23 
14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452935 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"5735cff00eeb26df95f36e1548b653da767e9f1764f098d8c148f2a98be789ec"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452946 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" event={"ID":"172d47fd-e1a1-4d77-9e31-c4f22e824d5f","Type":"ContainerStarted","Data":"b51483fb30eb125aa1ba7d4f431cb050c71a85528244347d0f1ad28b65c42bd5"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452959 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"124a68bb47c3e30bb951c84920269609e54235b4ea92c589f9e5c85ddedbc17d"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452971 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerDied","Data":"0ccc3d4cff85a134107f96cc12ad89d4b417a48e075d82e5410bc67ca88a884e"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452983 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"d07dabdbb75e5831d675bd90d3cedb35103b104effb049877f3263f5f9bc95d3"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.452995 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" event={"ID":"76c67569-3a72-4de9-87cd-432a4607b15b","Type":"ContainerStarted","Data":"475b3c76ad7ac657e1ef59565d052b44742cf128419941b4feb55cbb0d636474"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.453057 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerStarted","Data":"f1624e490e996adbc1581323e6fc52ba5b194ad2cd07588f621ae1c9497226e8"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.453175 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.453567 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.454012 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerDied","Data":"be11245e52df36836387b793176a5296c3112993cdce052d05331b901d833321"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.454082 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerStarted","Data":"2b304c8b0f837d8a5676c01cc4b19f81d7aa44858d1d53ee5b0312db0b49e71f"} Feb 23 14:34:59.455160 master-0 kubenswrapper[28758]: I0223 14:34:59.454100 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" 
event={"ID":"b090ed5a-984f-41dd-8cea-34a1ece1514f","Type":"ContainerStarted","Data":"f850a3ee886935c4dd2d0266e97d2bc00c30e8e88c1475292224ac9d98f6501e"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.454125 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"1225c7e0-f2d1-4b39-979c-c77191862c81","Type":"ContainerDied","Data":"77b1792020215e1792b7b140e0ac936225d54418ad659b82a3189d8865905a56"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.456917 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"1225c7e0-f2d1-4b39-979c-c77191862c81","Type":"ContainerDied","Data":"c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.456949 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c76f131e30758a750c900160974a6f4d36e53aa3d716b4777b6fcccdf538b7ff" Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.456968 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"02e5f74fd0fc64761a36d80414a3b15fecade9e3a74052c75afffc286b820ab8"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.456996 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"b1dde519d91e918a963d5dd69f1689c73c27c10bcb7f3538a8a0cc3077e15afb"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.457014 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" 
event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"20acbe7091a8520599c223e28966f8924f32322212993de0d035d00606e90c34"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.457033 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" event={"ID":"fae9a4cf-2acf-4728-9105-87e004052fe5","Type":"ContainerStarted","Data":"f9fabe4de8507d0278903a966443b61784dc222f54713517ea295798fc992f95"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.457035 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.457046 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vdzqk" event={"ID":"09d80e28-0b64-4c5d-a9bc-99d843d40165","Type":"ContainerStarted","Data":"f63f2509174f7f7271730d33288707704d50e4f6775be28d027d37556b5992a9"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.457177 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vdzqk" event={"ID":"09d80e28-0b64-4c5d-a9bc-99d843d40165","Type":"ContainerStarted","Data":"7d9debfc99355a24383e4ffd764682011042ebcd62151bc7e6d7e61d3c2be56f"} Feb 23 14:34:59.457613 master-0 kubenswrapper[28758]: I0223 14:34:59.454925 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457198 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" event={"ID":"ad0f0d72-0337-4347-bb50-e299a175f3ca","Type":"ContainerStarted","Data":"7ba62a28fa741876aeafcc419bd3f3721acce59a5947cd8fcdc354def4e8ba87"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457826 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" event={"ID":"ad0f0d72-0337-4347-bb50-e299a175f3ca","Type":"ContainerDied","Data":"c3f209a9ce16ae00e125bd88a555117337a8948041a4b5c781124f66c958f969"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457885 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" event={"ID":"ad0f0d72-0337-4347-bb50-e299a175f3ca","Type":"ContainerStarted","Data":"3dbe1f3d3698f2e251e24d454f894aefdf798ceecbb606aa9dd5f9be4602195a"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457901 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerStarted","Data":"80c3493a0d8d53c83776cd0edf83e55e36824f6eb21d9ce4f03e101c3a13e139"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457911 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerDied","Data":"e192093c7698f9c13f14fd55a50b3b960cd4142b3b8cb914299c2709465ffc51"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457924 28758 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerStarted","Data":"4d87515ad6869d3e6b52fa61f2c70d062fc9d561e8b978fe5f1550fbaf07704a"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457938 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" event={"ID":"66c72c71-f74a-43ab-bf0d-1f4c93623774","Type":"ContainerStarted","Data":"e209b32301611ace99d9d8f60b3c7574bcb7691d3f24d73da6cbdd55987d8c54"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457955 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc8fc4dee0a8892459a65c0cd44c3d38a00ddab249a8e98f6954d9605e9c33a" Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457969 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerStarted","Data":"3af9ebc2851196aac12096fa739a23a372f4916e42d4cc0e0b42e3494587ee38"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457979 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerStarted","Data":"9cd0e2f066454ddc59ba9d55af0212ac1d7bb30de28054ded4b6510cddb437d7"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.457993 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerDied","Data":"1c473cd845ba12d4bf1927e76251ff9dc47cd40e997137152d0746ccd7834430"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.458002 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ckhv6" 
event={"ID":"15ad7f4e-44c6-4426-8b97-c47a47786544","Type":"ContainerStarted","Data":"38ec00e9dfbef6fee1166da5b097f1e6a12696d48f32303b497f0b2d760141c8"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.458015 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" event={"ID":"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4","Type":"ContainerStarted","Data":"7d1da5487ea8eb934b2cc7ce134e56590c6e42e0ab8b62329b544c3d98919034"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.458057 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" event={"ID":"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4","Type":"ContainerStarted","Data":"69bf6c3db3f46a2800c793cb644d9476720461fc601ab495118365c94dd14b4f"} Feb 23 14:34:59.458060 master-0 kubenswrapper[28758]: I0223 14:34:59.458070 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" event={"ID":"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4","Type":"ContainerStarted","Data":"d9d7c0c01b5302e99d057b82e04bc20f6aaa2ecd35612cea6195a17dbb1d878e"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458080 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7b6jk" event={"ID":"2f876e5d-2e82-47d0-8a9c-adacf2bddf77","Type":"ContainerStarted","Data":"f105f21310354d57ab5d681251f5d073aac907cbf6a02b20b473f0be82c5c836"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458095 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7b6jk" event={"ID":"2f876e5d-2e82-47d0-8a9c-adacf2bddf77","Type":"ContainerStarted","Data":"9307bb7ee156f4761943094e1eb907a68e217cea3a83d35051b952c84a004e40"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458103 28758 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerStarted","Data":"c12a63d20fd94ddf67f99b5864ba6402ee26036260625afea30b344472b19c16"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458117 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerDied","Data":"e062ef1f26297d24d2516be8292a3297ef7a87cfa574a75bfb2f2e2e904d65e1"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458131 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" event={"ID":"cb6e88cd-98de-446a-92e8-f56a2f133703","Type":"ContainerStarted","Data":"6ffc0e356ee8d2e23632fe04da113c89cd2bff5243dd2b5c07a151a546ba49d8"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458139 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9gxm" event={"ID":"ded555da-db03-498e-81a9-ad166f29a2aa","Type":"ContainerStarted","Data":"ee198b41064b4937283c60c072335d7824964ece616cc687d14de79a4a2e8885"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458179 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9gxm" event={"ID":"ded555da-db03-498e-81a9-ad166f29a2aa","Type":"ContainerStarted","Data":"7644d6b4dd6d2352356f500ef21c6602c372872ba1a236023043ba253ba34314"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458191 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" 
event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerStarted","Data":"9deb346771c78fcd8dfbd7481f6daa1fde9e767e61631dcce1b8d456640db83f"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458219 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerDied","Data":"dd1e3de2ec0845831f5cf402eb5eb3565db815b7cdf1b7b1df427c27e6e8d027"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458233 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerDied","Data":"d34072e7de379cff9844dcf3892b8004b156d1d5b0fd3f937aa6aac0ab1f96bb"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458246 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfb9h" event={"ID":"3f86e881-275c-4387-a23a-06c559c8f1e8","Type":"ContainerStarted","Data":"2421d3a73005cb81482c335c859228ad362145060855ec39106d04eb50279bdb"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458255 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerStarted","Data":"5c9a811dd7ca05a47e75def9ccd5b3cad7fd1e69fba1f4ac35541ff048398ec8"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458270 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerStarted","Data":"da1d5c483a29eae497076b88d3f36ceedbd222eda1407e5ddf3aa59ebba386d7"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458281 28758 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerDied","Data":"0bda8d15a11221e7b98f49af56e0807945868c4a5e5d028da4a5c53d7f410c01"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458298 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" event={"ID":"1c60ff3f-2bb1-422e-be27-5eca96d85fd2","Type":"ContainerStarted","Data":"fee0bde3d0eee2f0bc5f9cbe5f3f907b178692716f3f6aef77b4bea08c864506"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458333 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" event={"ID":"c67a2ed2-f520-46fc-84d3-6816dc19f4e0","Type":"ContainerStarted","Data":"c9089f7ec9b403b820c3589f967f8a29cc7167ff44a814892c003b600afb7102"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458382 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" event={"ID":"c67a2ed2-f520-46fc-84d3-6816dc19f4e0","Type":"ContainerStarted","Data":"db5ef21ef8eab7ca7c090f44b0b9b626032073152ce3cc515970e88c0bd210b9"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458400 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" event={"ID":"c67a2ed2-f520-46fc-84d3-6816dc19f4e0","Type":"ContainerStarted","Data":"b96554c0b26af60fc366d01cdb0653dfe860650b866d469b8eb85b8f7a39e783"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458415 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" 
event={"ID":"493a9ed3-6d64-489a-a68c-235b69a58782","Type":"ContainerDied","Data":"1df16973da8e7c98a51b37b7335c255585ebd5dc4bbbed0d842fe3c32df42186"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458435 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"493a9ed3-6d64-489a-a68c-235b69a58782","Type":"ContainerDied","Data":"cb9c2c793a1a03d8088100a56c493a223ec7cd474c24708ed4bb05825975b542"} Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458443 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb9c2c793a1a03d8088100a56c493a223ec7cd474c24708ed4bb05825975b542" Feb 23 14:34:59.458828 master-0 kubenswrapper[28758]: I0223 14:34:59.458459 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59b498fcfb-rz897" event={"ID":"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2","Type":"ContainerStarted","Data":"8d9482f71c6dc9a00e4b04f7ca64bfc8a0661a7250561f2c67d72c80ed865f03"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458761 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59b498fcfb-rz897" event={"ID":"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2","Type":"ContainerStarted","Data":"dac97420bb9e3883db6238fff45b69e331260668444b09a65159744c334f79d2"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458875 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nwdpd" event={"ID":"87f989cd-6c19-4a30-833a-10e98b7a0326","Type":"ContainerStarted","Data":"3d15bb45bdbf256571f89062eabcbe60b3f2bafe86110ce99ab9a1b2166faf20"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458891 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nwdpd" 
event={"ID":"87f989cd-6c19-4a30-833a-10e98b7a0326","Type":"ContainerStarted","Data":"b960fc2af9e3400ec5c9c6469cfc9540631ffe8f2ef43e226085fb14e2ada0b8"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458925 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" event={"ID":"0315476e-7140-4777-8061-9cead4c92024","Type":"ContainerStarted","Data":"65ed56b86651ccae834781c13bcafb7784fdc880e0648daccca8f9316f065493"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458944 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" event={"ID":"0315476e-7140-4777-8061-9cead4c92024","Type":"ContainerStarted","Data":"97f199fa26d5c3158a89f49cff2f70c3039a2be9fc4ad7fe0571f7a519be854c"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458960 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6z45" event={"ID":"0514f486-2562-473d-8b01-b69441b82367","Type":"ContainerDied","Data":"ac1b1e24015c720352cbb49d46332282e9687278977a0db4df21fe4d03fe58bd"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458970 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-r6z45" event={"ID":"0514f486-2562-473d-8b01-b69441b82367","Type":"ContainerDied","Data":"ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458983 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2c13172eac3ecac6ffc4f8cbbcc4a92023a5bb4123fc2178049f9834005518" Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.458992 28758 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3061aec629b93c613efcc696d7dd1406550961b477efc6823a883d8b2ccdd9eb" Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459003 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" event={"ID":"c02c8912-46c9-4f86-ad28-9bfb2eca4e54","Type":"ContainerStarted","Data":"2e584c3fc9cf516104594bf0a348d5b59d91e3945a5a45d94da10c2d426282bd"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459013 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" event={"ID":"c02c8912-46c9-4f86-ad28-9bfb2eca4e54","Type":"ContainerStarted","Data":"b6a377cd77390650f0232d49deb3c30f98f3070caae925837c6c5aeeec6246e5"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459025 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" event={"ID":"e5104cdd-85b8-49ba-95ca-3e9c8218a01e","Type":"ContainerStarted","Data":"4a7b48481b75f40fe6158bb21dd43fb7c910b44534fca5550f51e80a8e690723"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459038 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" event={"ID":"e5104cdd-85b8-49ba-95ca-3e9c8218a01e","Type":"ContainerStarted","Data":"82317c2ff886c91d683fc1357282b875ededf218f4ed66da91969784b2202cd9"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459151 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" event={"ID":"fbb66172-1ea9-4683-b88f-227c4fd94924","Type":"ContainerStarted","Data":"688f3915cda6841fdb5b96d4c61c8187855740e4ef08e7e26901e30e5fbf3918"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459168 28758 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" event={"ID":"fbb66172-1ea9-4683-b88f-227c4fd94924","Type":"ContainerDied","Data":"4bd96aadee1934ae65fac50d897e75007505a399d7d143ad871ced8edd81b895"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459186 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" event={"ID":"fbb66172-1ea9-4683-b88f-227c4fd94924","Type":"ContainerStarted","Data":"16c1ddc99b10767e840864cd1d61fae8aedde334dad20b9e34c987fd909a5a36"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459201 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" event={"ID":"9416f5d0-32b4-4065-b678-26913af8b6dd","Type":"ContainerStarted","Data":"f866731e4ac5121ccde39a6f28422037df55500fc5889296919662d103c3a36f"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459215 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" event={"ID":"9416f5d0-32b4-4065-b678-26913af8b6dd","Type":"ContainerStarted","Data":"ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.454235 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459225 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" event={"ID":"0cebb80d-d898-44c8-82b3-1e18833cee3f","Type":"ContainerStarted","Data":"9c93c157cec045045a20555d6c266b19e6eda783508c1ee10612534c934a9d8f"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459339 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" event={"ID":"0cebb80d-d898-44c8-82b3-1e18833cee3f","Type":"ContainerStarted","Data":"5b1e3102064a2333a33694b441523235b1896dd8c0ad7164b8c2f46c1cc4e9c2"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459354 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t5h8h" event={"ID":"08c561b3-613b-425f-9de4-d5fc8762ea51","Type":"ContainerStarted","Data":"12fd878098bd7b54da3d1c8bde1617a7040cd38eee56d2ee103f7f6046abc156"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459454 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-t5h8h" event={"ID":"08c561b3-613b-425f-9de4-d5fc8762ea51","Type":"ContainerStarted","Data":"4f93047dc0b7cb7f4f7c771225dde60d727738bda2832af456ff04f11ecb402a"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459472 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerStarted","Data":"0ee1bcffd0080993a8a7d5ae7b401f3f8009a9023b5b1079dc04460210753ad8"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459500 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerStarted","Data":"6e8f1beca1435f1c0deddd5b1efa7b8ba2b2233374745801d4686a667cf0b8d6"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459510 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerDied","Data":"3d6da8a2ab007c14781f7a758e38f4dc17838974a913b095ba4be079439082e2"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459524 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" event={"ID":"af950a67-1557-4352-8100-27281bb8ecbe","Type":"ContainerStarted","Data":"7c2bb3b30fb024eb641e1765113a1a36bccda4c627461f72aa312212e851ddb2"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459537 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" event={"ID":"78f5dea4-ed09-44a1-8eb1-d1fc497cc173","Type":"ContainerDied","Data":"27840ca7db3cacb7b24041918e945eaa29f553e36d936e622a640f67b21753c5"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459547 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm" event={"ID":"78f5dea4-ed09-44a1-8eb1-d1fc497cc173","Type":"ContainerDied","Data":"692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459559 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="692c4eae0472c8ecc4c0d101041ab3c55666af550c8833bacb84eb0444c9ad4b" Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459567 28758 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"6333d975bdd0987ec01df36eb0d3e9c836ef6da8f8df4fea72b742962ec92ffd"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459638 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerDied","Data":"7c094f15ea265ac3d44bbebfb78fef4402e37dfe5737cb2bab354a08b8292a17"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459653 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" event={"ID":"cf04aca0-8174-4134-835d-37adf6a3b5ca","Type":"ContainerStarted","Data":"a97726e86565351c3e74221a112a0906c73bf937319f30cac1b3e4b4f38e404f"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459662 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0fdb9885-7479-43b5-8613-b2857a798ade","Type":"ContainerDied","Data":"0a7994e86e7ddf474fa9a6e9d028e17c8d71e5299119418e1b05d25a7b604984"} Feb 23 14:34:59.459698 master-0 kubenswrapper[28758]: I0223 14:34:59.459679 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0fdb9885-7479-43b5-8613-b2857a798ade","Type":"ContainerDied","Data":"3cbdfb9045c2d2cb397063c37573cc9d345a2e61b6805238ad5391bd43edfbaa"} Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.459692 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cbdfb9045c2d2cb397063c37573cc9d345a2e61b6805238ad5391bd43edfbaa" Feb 23 14:34:59.461164 master-0 
kubenswrapper[28758]: I0223 14:34:59.459796 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerStarted","Data":"f3016d799a76a0e861077c16a305235311ad634f8054a70ca605ccf2e9c27c2c"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.459872 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerDied","Data":"cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.459942 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerStarted","Data":"842c59a633c6726baab1699104248bceff992214333b768aa99b1550ee1de3d0"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.459958 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerStarted","Data":"d129e9b13c717050b25d9d0f6e17182a80ea8a33b9d790e963d24636d1efd35e"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460002 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerDied","Data":"3e43920c8c9e66c01584e52a234477388c129ea94fe151ecc6c23098a8981522"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460018 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerStarted","Data":"d6900766f82c9e206cd3e7039baaa2878e6511f4627d07a00a47ed50496678b9"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460034 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" event={"ID":"5b54fc16-d2f7-4b10-a611-5b411b389c5a","Type":"ContainerStarted","Data":"305f42f52b6ba5ef239c92f6ac8cee0e2721fbe74d1ef92a70428b3a6fabdd04"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460050 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerStarted","Data":"50bc9e472f842efe389fe206aab27bbc1397e0168e5678badfa7c1d1c6991cdf"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460066 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerDied","Data":"0d84cce0e88dcc70d83b8dd67a4e91c62d1f30fed4495c32ca427288ab62004f"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460084 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" event={"ID":"e2d00ece-7586-4346-adbb-eaae1aeda69e","Type":"ContainerStarted","Data":"b0f1382249dc5f24b8f4811073e190383453a7404f8296e621cf4a7e56c21042"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460097 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"a52cbaf6-c1af-4c29-aef9-67523f5148c6","Type":"ContainerDied","Data":"f2ac2c56c1e34a13c986eed31e989cbdb13313c0a91d44d3e68704b2399e5a39"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460115 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"a52cbaf6-c1af-4c29-aef9-67523f5148c6","Type":"ContainerDied","Data":"a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460165 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0069da11d319bc5848078fcc0287e9847a5a57b6af5121c1b96241adcdaa914"
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460181 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerStarted","Data":"18b02500a922018fef0fe170792a110deb1ca490ebe442765b459f2885b97744"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460204 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerDied","Data":"943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460218 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerStarted","Data":"121fb1d62a402b22b2ce0dcefcc58af76b44ad548fdacc6da5113c93b5d1d4e0"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460233 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerStarted","Data":"73be864b5b54929659f002b9332c455fcb5b8b1df377e86d1c53be302d99b753"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460251 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerDied","Data":"82543a2650cb25bd6bfd3b4eeba404e288e77827c47ab01f21f6a40862867df7"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460265 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerDied","Data":"ce4e5fa17dbd27ef1bc8352b48b01fe69433b598b77046f093daa7a26c560341"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460304 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fjpvt" event={"ID":"adbf8f71-f005-4e5b-9de1-e49559cf7386","Type":"ContainerStarted","Data":"244c9349c0c82d28b67e2cfc680e10b4528e1ddb2f6ad558456c92eee9746fa9"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460321 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qwsmk" event={"ID":"c0a39496-5e47-4415-b8bf-ed0634797ce1","Type":"ContainerStarted","Data":"82ebb613a27a91687ba715ffa70d565def5a49f407b70b9bf2cfb026a5c45a01"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460348 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qwsmk" event={"ID":"c0a39496-5e47-4415-b8bf-ed0634797ce1","Type":"ContainerStarted","Data":"3debae3963d7ed5cbda602b7ac89dc5c4d861ae9dad9b89fb0b3fcce27f1aad1"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460357 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" event={"ID":"607c1101-3533-43e3-9eda-13cea2b9dbb6","Type":"ContainerStarted","Data":"45850e1beb05a95ae5fcf3a42237f73e3fe69700a603134a49b7e403ae4ba9f9"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460367 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" event={"ID":"607c1101-3533-43e3-9eda-13cea2b9dbb6","Type":"ContainerStarted","Data":"af6d79da0d728e46a6d28f0e6d5149695cf9e0a5184f46643dc1c16723bb00aa"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460376 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" event={"ID":"607c1101-3533-43e3-9eda-13cea2b9dbb6","Type":"ContainerStarted","Data":"a928f690a2e58a25ba69277c1852026731fa14cc1f9743eea2995395d98f0871"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460384 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerStarted","Data":"7389ea8bfb9da14a68c3071f0b69781f7e61f7f464c04916bdf2668709eba104"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460396 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"6c7ee6bebf88d829805371dc4fd4b58845a3f175897eb6486d1688a8a41b95ec"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460409 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"a6fb92a24f40b4f0a4db9442684eefd34b35d2511917f6d03fe2ac8345b66ead"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460420 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"e5ef5b210d67b35d196c3c58900eaedb9852f06e215b468c9e1c1dc53fce376f"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460438 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"29b61cbeccf4eaed8df82b56cbe6a444cd43fd7fd1043bff465ae48185e7e6a0"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460447 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"5fd96309ade76aec20ed37e459e178ae08d952af2aa513f3703806ca12a7c927"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460456 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerDied","Data":"7e56c504fefbada4ed2745ea4973c98d064a08b56a86637d3809d7946280cc20"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460464 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jdsv6" event={"ID":"483786a0-0a29-44bf-bbd0-2f37e045aa2c","Type":"ContainerStarted","Data":"fb44bfa273a0390e40795165f46ee3660a2a5c93ba6fcc3ac327138fc4e69610"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460472 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" event={"ID":"a4ae9292-71dc-4484-b277-43cb26c1e04d","Type":"ContainerStarted","Data":"61dc17f9e6cf7debb617c1a5d6bf61a564d357e02c830ef1638e669b87de1835"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460498 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" event={"ID":"a4ae9292-71dc-4484-b277-43cb26c1e04d","Type":"ContainerDied","Data":"fafa7b0f21c17417165ff9592e80bbb6992685b66472f608cb30827b7d663491"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460512 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" event={"ID":"a4ae9292-71dc-4484-b277-43cb26c1e04d","Type":"ContainerStarted","Data":"8bce00dde4bf57f38bea21a54eaeb5445e9a6797bd70cd70ab2f40465ffb6015"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460521 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" event={"ID":"585f74db-4593-426b-b0c7-ec8f64810549","Type":"ContainerStarted","Data":"e8151e4f2721f179d56208bf1c11204648825d6daf5824777e4cd2cde1fcc527"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460538 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" event={"ID":"585f74db-4593-426b-b0c7-ec8f64810549","Type":"ContainerDied","Data":"3d191963e287b24eb8e359eae476b7710f1b01ed3998cce17300434d7f6e8d0b"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460547 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" event={"ID":"585f74db-4593-426b-b0c7-ec8f64810549","Type":"ContainerStarted","Data":"601970e99fee05d1ddde3baeb681b21e539729838cc176b833fde61b155a74a5"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460556 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" event={"ID":"ceba7b56-f910-473d-aed5-add94868fb31","Type":"ContainerStarted","Data":"5f3a2dd4bf392d1f8fcfc7780b16f411d497e94edaa80b6729dddf180e05f5b1"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460564 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" event={"ID":"ceba7b56-f910-473d-aed5-add94868fb31","Type":"ContainerStarted","Data":"a92c459609a9caf278c919ccf0e276499a4748b87d0a276b18119bbb8961e6d8"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460572 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" event={"ID":"ceba7b56-f910-473d-aed5-add94868fb31","Type":"ContainerStarted","Data":"f2696aa250be24ef04b3fabb47f7471ddc013ced2adb7ee07e74a3053e3dcc2e"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460581 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerStarted","Data":"fbac2782160d294364c1ee2f8ffe6113c931b2c5daaaa5bdf6584ea5ed99fbf7"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460591 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerDied","Data":"41bf9ac4f6ba09181a226cfe2ad608e31e59bbb137b1b1ead593f9c6c980fde1"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460619 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" event={"ID":"961e4ecd-545b-4270-ae34-e733dec793b6","Type":"ContainerStarted","Data":"7a94361576154416139d60324d6f01b1540eacf16a8dedc989cadf9cc6e41fca"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460630 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerStarted","Data":"2243bd1ff5f9d8cba5c5a4b887f7a319c493e141478b3849cae308750eee3552"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460639 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerStarted","Data":"9204e03e920d3db05653e38c3d6e44bea01fa95318e2f1f94371f8808becda76"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460647 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerDied","Data":"dd9d0218254ede188f4f7d6704abe8b5fb6e6589605eb6657b67d6dc33a0eb39"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460656 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-666b887977-f7h55" event={"ID":"588a804a-430a-47f4-aa97-c08e907239da","Type":"ContainerStarted","Data":"15c8631804d0c1c71f4b64c5ee4ec4990f2c2f6adda4a03d015df366f3b28fd1"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460665 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" event={"ID":"ea0b3538-9a7d-4995-b628-2d63f21d683c","Type":"ContainerStarted","Data":"4b8ec9029f7fb9a56b769915848138fa88b329cbc5b64d2aed62c627e82f917f"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460673 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" event={"ID":"ea0b3538-9a7d-4995-b628-2d63f21d683c","Type":"ContainerDied","Data":"ef55d8167c92b21a23135f3a8ced87d51d79df376d7aca850c7cba442f901e30"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460694 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" event={"ID":"ea0b3538-9a7d-4995-b628-2d63f21d683c","Type":"ContainerStarted","Data":"fe38a11f2899f2913cfd5201bad475af2fb0c867e6d00537cbb69269270c3e16"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460702 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"0189afd080f3b6ea37eeb4ab1c36f7fceab4ba1f69c22daf84852629bcff7a8b"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460724 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerDied","Data":"3d02c5174ccc3722ad642137b2ae38a4ad6beee863578d93948d8f75b3ffc635"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460733 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" event={"ID":"2e89a047-9ebc-459b-b7b3-e902c1fb0e17","Type":"ContainerStarted","Data":"390cfd29e9f1dac7e5d17a7f7165d182236f5a201c52a8221fd54d5117d708f7"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460742 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"29f7b30e-bf6a-4e54-b009-1b0fcd830035","Type":"ContainerDied","Data":"325ae25a8338b6a2543759476e50b822896d1071332fcb78a23d45a461fab54f"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460752 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"29f7b30e-bf6a-4e54-b009-1b0fcd830035","Type":"ContainerDied","Data":"d4622d20df32d4655ff4c5d8c0ab82bdd9c1a367900ef291744093a4801a66c4"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460759 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4622d20df32d4655ff4c5d8c0ab82bdd9c1a367900ef291744093a4801a66c4"
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460768 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" event={"ID":"06bde94a-3126-4d0f-baba-49dc5fbec61b","Type":"ContainerStarted","Data":"7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460777 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" event={"ID":"06bde94a-3126-4d0f-baba-49dc5fbec61b","Type":"ContainerStarted","Data":"e75c63f7eeabff918836bbbeb9c11ed440bed473107c3e0b28076ddfdf91aadb"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460785 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86l7f" event={"ID":"6a801da1-a7eb-4187-98b8-315076f55e19","Type":"ContainerStarted","Data":"e9346a8d9dc410d0da0fd8dd3433eca16fb555bf7b080b48fc268fe5ccceefb8"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460795 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86l7f" event={"ID":"6a801da1-a7eb-4187-98b8-315076f55e19","Type":"ContainerStarted","Data":"3a1f76c10c7d29eada49f24161c1b7b7382f293e5fcfef65b090da9015564f91"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460803 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-86l7f" event={"ID":"6a801da1-a7eb-4187-98b8-315076f55e19","Type":"ContainerStarted","Data":"6d0f92f8c3e4f5b782259d4379958d9e827827d44cda6b952bb56160c213bbf8"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460814 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9dnsv" event={"ID":"ace75aae-6f4f-4299-90e2-d5292271b136","Type":"ContainerStarted","Data":"b5bbb5699c2dab9ad551747a05cc6c2594a6cc174c7a5d51b7f4dba4a3a2f82a"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460828 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9dnsv" event={"ID":"ace75aae-6f4f-4299-90e2-d5292271b136","Type":"ContainerStarted","Data":"3b7e44d2452ae6675e651554bab681a42fae84f0e05ad5a73fb3027444acc8b8"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460856 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9dnsv" event={"ID":"ace75aae-6f4f-4299-90e2-d5292271b136","Type":"ContainerStarted","Data":"edfafba30f67b299b61cf6429b5bf8b47c050040f18802ede0a7f2834a957ae9"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460868 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" event={"ID":"18da400b-2271-455d-be0d-0ed44c74f78d","Type":"ContainerStarted","Data":"fa5fb647e8638eaad65b8811059d4ee7fcda78f5ba1a5eaad0814e3ad8d66e8c"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460877 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" event={"ID":"18da400b-2271-455d-be0d-0ed44c74f78d","Type":"ContainerStarted","Data":"c81b3d29f769948ceb44d95e3377e7a7b6aa0ce8d14bbde1a834e3be95ef060d"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460885 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" event={"ID":"18da400b-2271-455d-be0d-0ed44c74f78d","Type":"ContainerStarted","Data":"b6510ef0b5fc51e782ccd9549d8a6adfc4072f0d6db015ee20beaf2f6eb3bcaa"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460894 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" event={"ID":"646fece3-4a42-4e0c-bcc7-5f705f948d63","Type":"ContainerStarted","Data":"616aaf781ead921aaf00c1f2c54e5afa774694ee9d0c517cb881a4d591a4f886"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460904 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" event={"ID":"646fece3-4a42-4e0c-bcc7-5f705f948d63","Type":"ContainerStarted","Data":"d4eec9eade1a6fd9bfe0d642fe3ae425b01a962b7129ee11f0681674274aaff6"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460932 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"b6071b4962f53457c312546180e687a36b9dd499d861714917a6ca4caba881c5"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460941 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"5ace2c4cd314c92825fd50854b7a53f375dd7e9cb995361c6b2c717e5d66eb1b"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460961 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerDied","Data":"4c56fad74102d69cc7f4ad84a92d8641223fd6345d161d89086b8aeb7a8a3450"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460969 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" event={"ID":"92c63c95-e880-4f51-9858-7715343f7bd8","Type":"ContainerStarted","Data":"5c45c6ef7e4b05f37927f974cf6cb4b129b6dd3a04cd82a3e97ac1e29ceb5010"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460989 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.460998 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461006 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461013 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"ea1409538bec46d9eceb195d8a31f70cddcab9c02d2f2d5acf77e88b46aed24f"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461022 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"d25cb0e087422893459bc0facfa4b23104f61b45c1b3b866e5b4e0e0ad019f99"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461030 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerStarted","Data":"1b2ef32436a2cab762069d2b98a667383566db0becb9c8c8fa356e686ede57e6"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461063 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerDied","Data":"4059934c66f6a9887a7e6b1218e04bcfb0fcfe5376abb8c188a9213f581fe6f3"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461072 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" event={"ID":"674041a2-e2b0-4286-88cc-f1b00571e3f3","Type":"ContainerStarted","Data":"7745fe383c3438f3eb713290ae29bc45137b7df8f820bdc331981eebbfe561fe"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461080 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerStarted","Data":"bf5dc0832d3e371388160283edfa78766c830a1d36b70ceb5328f1c5793751db"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461089 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerDied","Data":"6fcd03423b6cd43db7e56c4b05dae69802a9ed15c0ea5b94c83b762001bf26a8"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461099 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerDied","Data":"9f45d1abf22f4312045c038d142f9bba7b80278a0653e6693862acdb73f898f7"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461108 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cdrlk" event={"ID":"efdde2df-cd07-4898-88f4-7ecde0e04d7a","Type":"ContainerStarted","Data":"4c87997a2f68dd1175880f954711e05356d4ed55a7a6f3583b752f9d11da5e55"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461127 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"25b855e3-80dc-4ee5-80ab-c4742578a92f","Type":"ContainerDied","Data":"9e428f62a82052df41d6797fdf53021748cfc643092e404d44bde9e1092162d6"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461138 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"25b855e3-80dc-4ee5-80ab-c4742578a92f","Type":"ContainerDied","Data":"38c349954c9e4d48bcd2d1d0bfed1c2e92410197933aa893d6cec912dd9abe84"}
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461156 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38c349954c9e4d48bcd2d1d0bfed1c2e92410197933aa893d6cec912dd9abe84"
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461160 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 14:34:59.461164 master-0 kubenswrapper[28758]: I0223 14:34:59.461165 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"a45da35c56f17703c91fdc8040f2ec004087a605a92000fedcffc1936ffcae7f"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.461553 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"cecfab4dc3898d095fcb3a6b3e1cf511480563e469afa63033300c82e251f626"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.461915 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462157 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"25bfdd1b2d2ccd5f7ad2b82ecfb58c4fad723643d443beb5181c17d3e22ca1ac"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462206 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerDied","Data":"e3f7365c1acb54a72a772e182d383b3e70a626dbae5d085f9cd81b46982b0137"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462222 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"d03a1e6620a92c780b0a91c72a55bc8b","Type":"ContainerStarted","Data":"0b6059c95ec8023c3749bdce17c5d7c3d4cfc4af6e64639a41582908eb86d4e6"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462233 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerStarted","Data":"0e0e360765d8d16da79c870190987bdf00a3af7c783a06b62100aaa85b3602c9"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462244 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerDied","Data":"86a800fe59aed9a0c248de7a352a6c1ffaea2cbdde27bb246147baa866e1c79a"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462253 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerStarted","Data":"88c122508c98e7a0f40824b07fef074fffccd316aa8ae95f930b03c4abba7eb2"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462262 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-td489" event={"ID":"bbe678de-546d-49d0-8280-3f6d94fa5e4f","Type":"ContainerStarted","Data":"a763a9aa12dde6c52d5c6991687ebd101bd47550719a37c47c1a30d449928cff"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462270 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerDied","Data":"7b1c7cc7b2e3d4c1fdbe5b4592355d6abc03a37f10ae5ab746402745b7ae1aa2"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.462284 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" event={"ID":"06bde94a-3126-4d0f-baba-49dc5fbec61b","Type":"ContainerDied","Data":"7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06"}
Feb 23 14:34:59.464947 master-0 kubenswrapper[28758]: I0223 14:34:59.464122 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Feb 23 14:34:59.465424 master-0 kubenswrapper[28758]: I0223 14:34:59.465306 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 23 14:34:59.475667 master-0 kubenswrapper[28758]: I0223 14:34:59.469206 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 23 14:34:59.477758 master-0 kubenswrapper[28758]: I0223 14:34:59.477712 28758 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Feb 23 14:34:59.479769 master-0 kubenswrapper[28758]: I0223 14:34:59.479715 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 23 14:34:59.499617 master-0 kubenswrapper[28758]: I0223 14:34:59.499554 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 23 14:34:59.511582 master-0 kubenswrapper[28758]: E0223 14:34:59.511534 28758 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.520233 master-0 kubenswrapper[28758]: I0223 14:34:59.520190 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 23 14:34:59.540233 master-0 kubenswrapper[28758]: I0223 14:34:59.540157 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 23 14:34:59.555772 master-0 kubenswrapper[28758]: I0223 14:34:59.555712 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl"
Feb 23 14:34:59.555772 master-0 kubenswrapper[28758]: I0223 14:34:59.555765 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-encryption-config\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:34:59.555772 master-0 kubenswrapper[28758]: I0223 14:34:59.555796 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:34:59.556218 master-0 kubenswrapper[28758]: I0223 14:34:59.555818 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p6hn\" (UniqueName: \"kubernetes.io/projected/06bde94a-3126-4d0f-baba-49dc5fbec61b-kube-api-access-2p6hn\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:34:59.556218 master-0 kubenswrapper[28758]: I0223 14:34:59.555915 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:34:59.556218 master-0 kubenswrapper[28758]: I0223 14:34:59.556155 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:59.556352 master-0 kubenswrapper[28758]: I0223 14:34:59.556248 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-volume-directive-shadow\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r"
Feb 23 14:34:59.556352 master-0 kubenswrapper[28758]: I0223 14:34:59.556285 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-serving-cert\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:34:59.556352 master-0 kubenswrapper[28758]: I0223 14:34:59.556309 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-etc-tuned\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.556352 master-0 kubenswrapper[28758]: I0223 14:34:59.556334 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ace75aae-6f4f-4299-90e2-d5292271b136-metrics-certs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:34:59.556352 master-0 kubenswrapper[28758]: I0223 14:34:59.556355 28758 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.556637 master-0 kubenswrapper[28758]: I0223 14:34:59.556393 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sflb\" (UniqueName: \"kubernetes.io/projected/c0a39496-5e47-4415-b8bf-ed0634797ce1-kube-api-access-9sflb\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:34:59.556637 master-0 kubenswrapper[28758]: I0223 14:34:59.556442 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:34:59.556637 master-0 kubenswrapper[28758]: I0223 14:34:59.556393 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-volume-directive-shadow\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:34:59.556637 master-0 kubenswrapper[28758]: I0223 14:34:59.556445 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-etc-tuned\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.556637 master-0 kubenswrapper[28758]: I0223 14:34:59.556617 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:34:59.556637 master-0 kubenswrapper[28758]: I0223 14:34:59.556645 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:34:59.556922 master-0 kubenswrapper[28758]: I0223 14:34:59.556664 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-utilities\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:34:59.556922 master-0 kubenswrapper[28758]: I0223 14:34:59.556682 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:34:59.556922 master-0 kubenswrapper[28758]: I0223 14:34:59.556700 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.556922 master-0 kubenswrapper[28758]: I0223 14:34:59.556893 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24829faf-50e8-45bb-abb0-7cc5ccf81080-serving-cert\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:34:59.557073 master-0 kubenswrapper[28758]: I0223 14:34:59.556943 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/646fece3-4a42-4e0c-bcc7-5f705f948d63-telemetry-config\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:34:59.557073 master-0 kubenswrapper[28758]: I0223 14:34:59.556970 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-utilities\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:34:59.557073 master-0 kubenswrapper[28758]: I0223 14:34:59.556986 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-systemd\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.557073 master-0 kubenswrapper[28758]: I0223 14:34:59.557015 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6qszm\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:34:59.557073 master-0 kubenswrapper[28758]: I0223 14:34:59.557037 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:34:59.557073 master-0 kubenswrapper[28758]: I0223 14:34:59.557058 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:34:59.557291 master-0 kubenswrapper[28758]: I0223 14:34:59.557079 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:34:59.557291 master-0 kubenswrapper[28758]: I0223 14:34:59.557098 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " 
pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:34:59.557291 master-0 kubenswrapper[28758]: I0223 14:34:59.557260 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-config\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:34:59.557291 master-0 kubenswrapper[28758]: I0223 14:34:59.557269 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0f0d72-0337-4347-bb50-e299a175f3ca-trusted-ca\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:34:59.557291 master-0 kubenswrapper[28758]: I0223 14:34:59.557288 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557346 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557367 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrvf\" (UniqueName: 
\"kubernetes.io/projected/af950a67-1557-4352-8100-27281bb8ecbe-kube-api-access-jxrvf\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557382 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557398 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94ddm\" (UniqueName: \"kubernetes.io/projected/15ad7f4e-44c6-4426-8b97-c47a47786544-kube-api-access-94ddm\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557412 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557428 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbcj\" (UniqueName: \"kubernetes.io/projected/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-kube-api-access-qtbcj\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " 
pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557445 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:34:59.557492 master-0 kubenswrapper[28758]: I0223 14:34:59.557462 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp5kb\" (UniqueName: \"kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:34:59.557853 master-0 kubenswrapper[28758]: I0223 14:34:59.557783 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:34:59.557853 master-0 kubenswrapper[28758]: I0223 14:34:59.557812 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2d00ece-7586-4346-adbb-eaae1aeda69e-serving-cert\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:34:59.557853 master-0 kubenswrapper[28758]: I0223 14:34:59.557830 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-node-pullsecrets\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.557970 master-0 kubenswrapper[28758]: I0223 14:34:59.557857 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:34:59.557970 master-0 kubenswrapper[28758]: I0223 14:34:59.557898 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.557970 master-0 kubenswrapper[28758]: I0223 14:34:59.557920 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6tj\" (UniqueName: \"kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:34:59.557970 master-0 kubenswrapper[28758]: I0223 14:34:59.557930 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:34:59.557970 master-0 kubenswrapper[28758]: 
I0223 14:34:59.557944 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:34:59.558244 master-0 kubenswrapper[28758]: I0223 14:34:59.557974 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtp7w\" (UniqueName: \"kubernetes.io/projected/0315476e-7140-4777-8061-9cead4c92024-kube-api-access-jtp7w\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:34:59.558244 master-0 kubenswrapper[28758]: I0223 14:34:59.557993 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:34:59.558244 master-0 kubenswrapper[28758]: I0223 14:34:59.558010 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:34:59.558244 master-0 kubenswrapper[28758]: I0223 14:34:59.558201 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8de1f285-47ac-42aa-8026-8addce656362-etcd-client\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " 
pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:34:59.558244 master-0 kubenswrapper[28758]: I0223 14:34:59.558236 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:59.558600 master-0 kubenswrapper[28758]: I0223 14:34:59.558259 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-resource-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.558600 master-0 kubenswrapper[28758]: I0223 14:34:59.558291 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4sbp\" (UniqueName: \"kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:34:59.558600 master-0 kubenswrapper[28758]: I0223 14:34:59.558313 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Feb 23 14:34:59.558600 master-0 kubenswrapper[28758]: I0223 14:34:59.558351 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.558600 master-0 kubenswrapper[28758]: I0223 14:34:59.558389 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:34:59.558600 master-0 kubenswrapper[28758]: I0223 14:34:59.558443 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-catalog-content\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:34:59.558600 master-0 kubenswrapper[28758]: I0223 14:34:59.558465 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp42\" (UniqueName: \"kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558616 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-catalog-content\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: 
I0223 14:34:59.558650 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9774f8c-0f29-46d8-be77-81bcf74d5994-serving-cert\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558672 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9774f8c-0f29-46d8-be77-81bcf74d5994-service-ca\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558677 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558692 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-containers\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558710 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qggzs\" (UniqueName: 
\"kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558730 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzcqx\" (UniqueName: \"kubernetes.io/projected/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-api-access-dzcqx\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558748 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-conf\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558768 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-tmp\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558785 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:34:59.558967 
master-0 kubenswrapper[28758]: I0223 14:34:59.558803 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr85\" (UniqueName: \"kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558821 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-policies\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558844 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558862 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558883 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558904 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558923 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8v9z\" (UniqueName: \"kubernetes.io/projected/fae9a4cf-2acf-4728-9105-87e004052fe5-kube-api-access-x8v9z\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558942 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Feb 23 14:34:59.558967 master-0 kubenswrapper[28758]: I0223 14:34:59.558962 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.558991 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b558268-2262-4593-893e-408639a9987d-tmp\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559025 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-ssl-certs\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559168 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9cf1c39-24f0-420b-8020-089616d1cdf0-serving-cert\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559221 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8llc8\" (UniqueName: \"kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8\") pod \"csi-snapshot-controller-operator-6fb4df594f-hkcgz\" (UID: \"a4ae9292-71dc-4484-b277-43cb26c1e04d\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559283 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-service-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559288 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559342 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559362 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559386 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-run\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559394 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf04aca0-8174-4134-835d-37adf6a3b5ca-config\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559418 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jlzj\" (UniqueName: \"kubernetes.io/projected/76c67569-3a72-4de9-87cd-432a4607b15b-kube-api-access-2jlzj\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559440 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559460 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qfr7\" (UniqueName: \"kubernetes.io/projected/e5104cdd-85b8-49ba-95ca-3e9c8218a01e-kube-api-access-8qfr7\") pod \"network-check-source-58fb6744f5-848dv\" (UID: \"e5104cdd-85b8-49ba-95ca-3e9c8218a01e\") " pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559496 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brd4j\" (UniqueName: \"kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559518 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559533 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559552 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559572 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559579 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-env-overrides\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559593 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-catalog-content\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559610 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559626 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysconfig\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559643 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72769\" (UniqueName: \"kubernetes.io/projected/ceba7b56-f910-473d-aed5-add94868fb31-kube-api-access-72769\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559656 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-apiservice-cert\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b"
Feb 23 14:34:59.559662 master-0 kubenswrapper[28758]: I0223 14:34:59.559660 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.559704 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.559907 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24829faf-50e8-45bb-abb0-7cc5ccf81080-config\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560027 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560154 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560179 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560222 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkx2\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-kube-api-access-knkx2\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560240 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a801da1-a7eb-4187-98b8-315076f55e19-config-volume\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560259 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2p2\" (UniqueName: \"kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560279 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-stats-auth\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560279 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560313 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/865ceedb-b19a-4f2f-b295-311e1b7a645e-serving-cert\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560426 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-catalog-content\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560538 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-images\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560562 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/76c67569-3a72-4de9-87cd-432a4607b15b-rootfs\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:34:59.560551 master-0 kubenswrapper[28758]: I0223 14:34:59.560563 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560633 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6a801da1-a7eb-4187-98b8-315076f55e19-config-volume\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560675 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560788 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560814 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmkf\" (UniqueName: \"kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560838 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560862 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560881 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/865ceedb-b19a-4f2f-b295-311e1b7a645e-config\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560905 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560945 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhsc6\" (UniqueName: \"kubernetes.io/projected/2e89a047-9ebc-459b-b7b3-e902c1fb0e17-kube-api-access-bhsc6\") pod \"csi-snapshot-controller-6847bb4785-5fw2x\" (UID: \"2e89a047-9ebc-459b-b7b3-e902c1fb0e17\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.560982 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-ca-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561012 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561040 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561058 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561099 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-config\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561136 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561172 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc4t\" (UniqueName: \"kubernetes.io/projected/efdde2df-cd07-4898-88f4-7ecde0e04d7a-kube-api-access-tpc4t\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561200 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561285 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561309 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561378 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561411 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561437 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-data-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561496 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-269v7\" (UniqueName: \"kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561511 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-operator-metrics\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561526 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9tkx\" (UniqueName: \"kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:34:59.561535 master-0 kubenswrapper[28758]: I0223 14:34:59.561555 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561582 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561606 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561633 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzj2j\" (UniqueName: \"kubernetes.io/projected/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-kube-api-access-lzj2j\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561690 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561713 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561732 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjs6f\" (UniqueName: \"kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561748 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561765 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66c72c71-f74a-43ab-bf0d-1f4c93623774-cache\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561784 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561813 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561829 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-etcd-client\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561846 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmmn9\" (UniqueName: \"kubernetes.io/projected/9b558268-2262-4593-893e-408639a9987d-kube-api-access-nmmn9\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561862 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g999k\" (UniqueName: \"kubernetes.io/projected/8ca3dee6-f651-4536-991c-303752c22f07-kube-api-access-g999k\") pod \"migrator-5c85bff57-vk2x8\" (UID: \"8ca3dee6-f651-4536-991c-303752c22f07\") " pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561878 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzn8r\" (UniqueName: \"kubernetes.io/projected/c84f66f0-207e-436a-8f4e-d1971fa815eb-kube-api-access-gzn8r\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561895 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561910 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561927 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561943 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561959 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.561994 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562003 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3488a7eb-5170-478c-9af7-490dbe0f514e-trusted-ca\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c"
Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562014 28758 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-hzrqz\" (UniqueName: \"kubernetes.io/projected/588a804a-430a-47f4-aa97-c08e907239da-kube-api-access-hzrqz\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562066 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562094 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562126 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-ca-certs\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562155 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88qnh\" (UniqueName: \"kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: 
\"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562180 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562206 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562233 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-sys\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562258 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562325 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562345 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-etcd-serving-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562362 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-default-certificate\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562380 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bde94a-3126-4d0f-baba-49dc5fbec61b-service-ca-bundle\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562397 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-host\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.562382 master-0 kubenswrapper[28758]: I0223 14:34:59.562448 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562467 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-utilities\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562498 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562515 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562532 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " 
pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562533 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-config\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562549 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-serving-ca\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562564 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562581 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562597 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562238 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbe678de-546d-49d0-8280-3f6d94fa5e4f-webhook-cert\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562681 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/66c72c71-f74a-43ab-bf0d-1f4c93623774-cache\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.562830 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-audit\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563023 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-service-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563142 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/585f74db-4593-426b-b0c7-ec8f64810549-marketplace-trusted-ca\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563208 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhr9\" (UniqueName: \"kubernetes.io/projected/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-kube-api-access-6qhr9\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563226 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/607c1101-3533-43e3-9eda-13cea2b9dbb6-metrics-tls\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563236 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563354 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdde2df-cd07-4898-88f4-7ecde0e04d7a-utilities\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:34:59.563854 master-0 
kubenswrapper[28758]: I0223 14:34:59.563374 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b714a9df-026e-423d-a980-2569f0d92e47-serving-cert\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563424 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-client\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563429 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f10f592e-5738-4879-b776-246b357d4621-ovnkube-script-lib\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563488 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-trusted-ca-bundle\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563547 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cd7w\" (UniqueName: \"kubernetes.io/projected/ea0b3538-9a7d-4995-b628-2d63f21d683c-kube-api-access-2cd7w\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " 
pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563615 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563625 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563684 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:34:59.563854 master-0 kubenswrapper[28758]: I0223 14:34:59.563709 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.563907 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-ovnkube-config\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.563951 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.563982 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85365dec-af50-406c-b258-890e4f454c4a-cco-trusted-ca\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564009 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w5kr\" (UniqueName: \"kubernetes.io/projected/18da400b-2271-455d-be0d-0ed44c74f78d-kube-api-access-2w5kr\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564055 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-serving-cert\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" 
Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564076 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-ovnkube-identity-cm\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564087 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5f8j\" (UniqueName: \"kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564203 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlz28\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-kube-api-access-jlz28\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564236 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564260 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-log-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564287 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-samples-operator-tls\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564314 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564343 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564372 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " 
pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564399 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-cvo-updatepayloads\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564427 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-snapshots\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564454 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564502 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:34:59.564669 master-0 kubenswrapper[28758]: I0223 14:34:59.564676 28758 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-snapshots\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564719 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564749 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564776 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-catalog-content\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564802 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqkz4\" (UniqueName: \"kubernetes.io/projected/6a801da1-a7eb-4187-98b8-315076f55e19-kube-api-access-pqkz4\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564829 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8ff\" (UniqueName: \"kubernetes.io/projected/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-kube-api-access-hj8ff\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564857 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564881 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9z2f\" (UniqueName: \"kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564904 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564928 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7jvd\" (UniqueName: \"kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " 
pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564964 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqzc\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-kube-api-access-xlqzc\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.564989 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.565015 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-containers\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.565041 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:34:59.565374 master-0 kubenswrapper[28758]: I0223 14:34:59.565067 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565455 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/57b57915-64dd-42f5-b06f-bc4bcc06b667-trusted-ca\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565701 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-catalog-content\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565675 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f10f592e-5738-4879-b776-246b357d4621-ovn-node-metrics-cert\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565742 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-operand-assets\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565758 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565820 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5d9w\" (UniqueName: \"kubernetes.io/projected/85365dec-af50-406c-b258-890e4f454c4a-kube-api-access-k5d9w\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565848 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565877 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:34:59.565924 master-0 kubenswrapper[28758]: I0223 14:34:59.565924 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:34:59.566169 master-0 kubenswrapper[28758]: I0223 14:34:59.565946 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-docker\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.566169 master-0 kubenswrapper[28758]: I0223 14:34:59.566052 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:34:59.566169 master-0 kubenswrapper[28758]: I0223 14:34:59.566076 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-serving-cert\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.566169 master-0 kubenswrapper[28758]: I0223 14:34:59.566094 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr868\" (UniqueName: \"kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " 
pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:34:59.566169 master-0 kubenswrapper[28758]: I0223 14:34:59.566102 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/961e4ecd-545b-4270-ae34-e733dec793b6-config\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:34:59.566169 master-0 kubenswrapper[28758]: I0223 14:34:59.566112 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:34:59.566169 master-0 kubenswrapper[28758]: I0223 14:34:59.566153 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3488a7eb-5170-478c-9af7-490dbe0f514e-metrics-tls\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:34:59.566363 master-0 kubenswrapper[28758]: I0223 14:34:59.566224 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpqzn\" (UniqueName: \"kubernetes.io/projected/87f989cd-6c19-4a30-833a-10e98b7a0326-kube-api-access-wpqzn\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:34:59.566363 master-0 kubenswrapper[28758]: I0223 14:34:59.566251 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:34:59.566363 master-0 kubenswrapper[28758]: I0223 14:34:59.566268 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:34:59.566363 master-0 kubenswrapper[28758]: I0223 14:34:59.566286 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:34:59.566363 master-0 kubenswrapper[28758]: I0223 14:34:59.566305 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.566579 master-0 kubenswrapper[28758]: I0223 14:34:59.566382 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/57b57915-64dd-42f5-b06f-bc4bcc06b667-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:34:59.566579 master-0 kubenswrapper[28758]: I0223 14:34:59.566409 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb6e88cd-98de-446a-92e8-f56a2f133703-config\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:34:59.566579 master-0 kubenswrapper[28758]: I0223 14:34:59.566555 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:34:59.566671 master-0 kubenswrapper[28758]: I0223 14:34:59.566586 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Feb 23 14:34:59.566671 master-0 kubenswrapper[28758]: I0223 14:34:59.566612 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp8kk\" (UniqueName: \"kubernetes.io/projected/3f86e881-275c-4387-a23a-06c559c8f1e8-kube-api-access-wp8kk\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:34:59.566671 master-0 kubenswrapper[28758]: I0223 14:34:59.566639 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:34:59.566671 master-0 kubenswrapper[28758]: I0223 14:34:59.566663 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-cache\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:34:59.566778 master-0 kubenswrapper[28758]: I0223 14:34:59.566664 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/961e4ecd-545b-4270-ae34-e733dec793b6-serving-cert\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:34:59.566778 master-0 kubenswrapper[28758]: I0223 14:34:59.566683 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-metrics-certs\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:34:59.566778 master-0 kubenswrapper[28758]: I0223 14:34:59.566734 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b090ed5a-984f-41dd-8cea-34a1ece1514f-env-overrides\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: 
\"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:34:59.566857 master-0 kubenswrapper[28758]: I0223 14:34:59.566820 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-wtmp\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:34:59.566857 master-0 kubenswrapper[28758]: I0223 14:34:59.566833 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-cache\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:34:59.566914 master-0 kubenswrapper[28758]: I0223 14:34:59.566848 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:34:59.566943 master-0 kubenswrapper[28758]: I0223 14:34:59.566922 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-dir\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.567008 master-0 kubenswrapper[28758]: I0223 14:34:59.566969 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-static-pod-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.567103 master-0 kubenswrapper[28758]: I0223 14:34:59.567057 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:34:59.567142 master-0 kubenswrapper[28758]: I0223 14:34:59.567119 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnfl\" (UniqueName: \"kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:34:59.567197 master-0 kubenswrapper[28758]: I0223 14:34:59.567148 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phzkn\" (UniqueName: \"kubernetes.io/projected/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-kube-api-access-phzkn\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:34:59.567197 master-0 kubenswrapper[28758]: I0223 14:34:59.567177 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-cert-dir\") pod \"etcd-master-0\" (UID: 
\"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.567266 master-0 kubenswrapper[28758]: I0223 14:34:59.567207 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x" Feb 23 14:34:59.567266 master-0 kubenswrapper[28758]: I0223 14:34:59.567253 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-sys\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:34:59.567321 master-0 kubenswrapper[28758]: I0223 14:34:59.567283 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chznd\" (UniqueName: \"kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6" Feb 23 14:34:59.567353 master-0 kubenswrapper[28758]: I0223 14:34:59.567319 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwtcj\" (UniqueName: \"kubernetes.io/projected/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-kube-api-access-pwtcj\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk" Feb 23 14:34:59.567353 master-0 kubenswrapper[28758]: I0223 14:34:59.567343 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-root\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:34:59.567415 master-0 kubenswrapper[28758]: I0223 14:34:59.567373 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:34:59.567464 master-0 kubenswrapper[28758]: I0223 14:34:59.567416 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42sml\" (UniqueName: \"kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:34:59.567532 master-0 kubenswrapper[28758]: I0223 14:34:59.567462 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.567532 master-0 kubenswrapper[28758]: I0223 14:34:59.567528 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-encryption-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.567611 master-0 kubenswrapper[28758]: I0223 
14:34:59.567553 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.567611 master-0 kubenswrapper[28758]: I0223 14:34:59.567593 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz" Feb 23 14:34:59.567682 master-0 kubenswrapper[28758]: I0223 14:34:59.567644 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:34:59.567682 master-0 kubenswrapper[28758]: I0223 14:34:59.567673 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl87q\" (UniqueName: \"kubernetes.io/projected/fbb66172-1ea9-4683-b88f-227c4fd94924-kube-api-access-kl87q\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:34:59.567877 master-0 kubenswrapper[28758]: I0223 14:34:59.567846 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-audit-dir\") pod 
\"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.567915 master-0 kubenswrapper[28758]: I0223 14:34:59.567881 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:34:59.567947 master-0 kubenswrapper[28758]: I0223 14:34:59.567902 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzsd\" (UniqueName: \"kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp" Feb 23 14:34:59.567977 master-0 kubenswrapper[28758]: I0223 14:34:59.567938 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-trusted-ca-bundle\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.568019 master-0 kubenswrapper[28758]: I0223 14:34:59.567996 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2d00ece-7586-4346-adbb-eaae1aeda69e-trusted-ca-bundle\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:34:59.568058 master-0 kubenswrapper[28758]: I0223 
14:34:59.568035 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-hosts-file\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk" Feb 23 14:34:59.568097 master-0 kubenswrapper[28758]: I0223 14:34:59.568058 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:34:59.568097 master-0 kubenswrapper[28758]: I0223 14:34:59.568080 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c02c8912-46c9-4f86-ad28-9bfb2eca4e54-tls-certificates\") pod \"prometheus-operator-admission-webhook-75d56db95f-rg8tp\" (UID: \"c02c8912-46c9-4f86-ad28-9bfb2eca4e54\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:34:59.568156 master-0 kubenswrapper[28758]: I0223 14:34:59.568096 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-usr-local-bin\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.568156 master-0 kubenswrapper[28758]: I0223 14:34:59.568115 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn4m\" (UniqueName: \"kubernetes.io/projected/255b5a89-1b89-42dc-868a-32ce67975a54-kube-api-access-5nn4m\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn"
Feb 23 14:34:59.568156 master-0 kubenswrapper[28758]: I0223 14:34:59.568139 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.568240 master-0 kubenswrapper[28758]: I0223 14:34:59.568160 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/646fece3-4a42-4e0c-bcc7-5f705f948d63-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:34:59.568240 master-0 kubenswrapper[28758]: I0223 14:34:59.568185 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chs7z\" (UniqueName: \"kubernetes.io/projected/8dd5fa7c-0519-4170-89c6-b369e5fc1990-kube-api-access-chs7z\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz"
Feb 23 14:34:59.568328 master-0 kubenswrapper[28758]: I0223 14:34:59.568281 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"
Feb 23 14:34:59.568369 master-0 kubenswrapper[28758]: I0223 14:34:59.568351 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rg9g\" (UniqueName: \"kubernetes.io/projected/12b256b7-a57b-4124-8452-25e74cfa7926-kube-api-access-2rg9g\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"
Feb 23 14:34:59.568415 master-0 kubenswrapper[28758]: I0223 14:34:59.568397 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-whereabouts-configmap\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.568446 master-0 kubenswrapper[28758]: I0223 14:34:59.568415 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.568504 master-0 kubenswrapper[28758]: I0223 14:34:59.568465 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.568504 master-0 kubenswrapper[28758]: I0223 14:34:59.568502 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.568569 master-0 kubenswrapper[28758]: I0223 14:34:59.568523 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:34:59.568569 master-0 kubenswrapper[28758]: I0223 14:34:59.568560 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-kubernetes\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.568634 master-0 kubenswrapper[28758]: I0223 14:34:59.568567 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-cni-binary-copy\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.568634 master-0 kubenswrapper[28758]: I0223 14:34:59.568583 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897"
Feb 23 14:34:59.568634 master-0 kubenswrapper[28758]: I0223 14:34:59.568605 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:34:59.568719 master-0 kubenswrapper[28758]: I0223 14:34:59.568648 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.568719 master-0 kubenswrapper[28758]: I0223 14:34:59.568670 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"
Feb 23 14:34:59.568719 master-0 kubenswrapper[28758]: I0223 14:34:59.568710 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzkcs\" (UniqueName: \"kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:34:59.568798 master-0 kubenswrapper[28758]: I0223 14:34:59.568739 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85365dec-af50-406c-b258-890e4f454c4a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7"
Feb 23 14:34:59.568798 master-0 kubenswrapper[28758]: I0223 14:34:59.568757 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"
Feb 23 14:34:59.568798 master-0 kubenswrapper[28758]: I0223 14:34:59.568769 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/674041a2-e2b0-4286-88cc-f1b00571e3f3-metrics-tls\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:34:59.568798 master-0 kubenswrapper[28758]: I0223 14:34:59.568795 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.568902 master-0 kubenswrapper[28758]: I0223 14:34:59.568816 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Feb 23 14:34:59.568902 master-0 kubenswrapper[28758]: I0223 14:34:59.568836 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"
Feb 23 14:34:59.568902 master-0 kubenswrapper[28758]: I0223 14:34:59.568853 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b54fc16-d2f7-4b10-a611-5b411b389c5a-package-server-manager-serving-cert\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:34:59.568902 master-0 kubenswrapper[28758]: I0223 14:34:59.568877 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9774f8c-0f29-46d8-be77-81bcf74d5994-kube-api-access\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm"
Feb 23 14:34:59.569008 master-0 kubenswrapper[28758]: I0223 14:34:59.568897 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b714a9df-026e-423d-a980-2569f0d92e47-config\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw"
Feb 23 14:34:59.569008 master-0 kubenswrapper[28758]: I0223 14:34:59.568989 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb6e88cd-98de-446a-92e8-f56a2f133703-serving-cert\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"
Feb 23 14:34:59.569067 master-0 kubenswrapper[28758]: I0223 14:34:59.568983 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khfkr\" (UniqueName: \"kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:34:59.569098 master-0 kubenswrapper[28758]: I0223 14:34:59.569060 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:34:59.569128 master-0 kubenswrapper[28758]: I0223 14:34:59.569098 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92c63c95-e880-4f51-9858-7715343f7bd8-available-featuregates\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:34:59.569165 master-0 kubenswrapper[28758]: I0223 14:34:59.569129 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:34:59.569165 master-0 kubenswrapper[28758]: I0223 14:34:59.569154 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.569220 master-0 kubenswrapper[28758]: I0223 14:34:59.569202 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/92c63c95-e880-4f51-9858-7715343f7bd8-available-featuregates\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:34:59.569220 master-0 kubenswrapper[28758]: I0223 14:34:59.569178 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4373687a-61a0-434b-81f7-3fecaa1494ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:34:59.569285 master-0 kubenswrapper[28758]: I0223 14:34:59.569241 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j"
Feb 23 14:34:59.569285 master-0 kubenswrapper[28758]: I0223 14:34:59.569273 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tl7p\" (UniqueName: \"kubernetes.io/projected/92c63c95-e880-4f51-9858-7715343f7bd8-kube-api-access-9tl7p\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:34:59.569421 master-0 kubenswrapper[28758]: I0223 14:34:59.569291 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:34:59.569421 master-0 kubenswrapper[28758]: I0223 14:34:59.569310 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj"
Feb 23 14:34:59.569421 master-0 kubenswrapper[28758]: I0223 14:34:59.569358 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf04aca0-8174-4134-835d-37adf6a3b5ca-serving-cert\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd"
Feb 23 14:34:59.569421 master-0 kubenswrapper[28758]: I0223 14:34:59.569362 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7rw\" (UniqueName: \"kubernetes.io/projected/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-kube-api-access-rr7rw\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5"
Feb 23 14:34:59.569589 master-0 kubenswrapper[28758]: I0223 14:34:59.569426 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:34:59.569589 master-0 kubenswrapper[28758]: I0223 14:34:59.569459 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-catalog-content\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h"
Feb 23 14:34:59.569589 master-0 kubenswrapper[28758]: I0223 14:34:59.569503 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-utilities\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt"
Feb 23 14:34:59.569589 master-0 kubenswrapper[28758]: I0223 14:34:59.569525 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a801da1-a7eb-4187-98b8-315076f55e19-metrics-tls\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f"
Feb 23 14:34:59.569589 master-0 kubenswrapper[28758]: I0223 14:34:59.569552 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:34:59.569589 master-0 kubenswrapper[28758]: I0223 14:34:59.569577 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f86e881-275c-4387-a23a-06c559c8f1e8-catalog-content\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h"
Feb 23 14:34:59.569801 master-0 kubenswrapper[28758]: I0223 14:34:59.569598 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-image-import-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:34:59.569801 master-0 kubenswrapper[28758]: I0223 14:34:59.569583 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbf8f71-f005-4e5b-9de1-e49559cf7386-utilities\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt"
Feb 23 14:34:59.569801 master-0 kubenswrapper[28758]: I0223 14:34:59.569664 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:34:59.569801 master-0 kubenswrapper[28758]: I0223 14:34:59.569704 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-lib-modules\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.569801 master-0 kubenswrapper[28758]: I0223 14:34:59.569729 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.569801 master-0 kubenswrapper[28758]: I0223 14:34:59.569761 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxqg\" (UniqueName: \"kubernetes.io/projected/adbf8f71-f005-4e5b-9de1-e49559cf7386-kube-api-access-5vxqg\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt"
Feb 23 14:34:59.569801 master-0 kubenswrapper[28758]: I0223 14:34:59.569788 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8de1f285-47ac-42aa-8026-8addce656362-etcd-ca\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm"
Feb 23 14:34:59.570024 master-0 kubenswrapper[28758]: I0223 14:34:59.569815 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.570024 master-0 kubenswrapper[28758]: I0223 14:34:59.569874 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:34:59.570024 master-0 kubenswrapper[28758]: I0223 14:34:59.569898 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r"
Feb 23 14:34:59.570024 master-0 kubenswrapper[28758]: I0223 14:34:59.569962 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:34:59.570024 master-0 kubenswrapper[28758]: I0223 14:34:59.570001 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0315476e-7140-4777-8061-9cead4c92024-tmpfs\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh"
Feb 23 14:34:59.570164 master-0 kubenswrapper[28758]: I0223 14:34:59.570029 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-bound-sa-token\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"
Feb 23 14:34:59.570164 master-0 kubenswrapper[28758]: I0223 14:34:59.570079 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-cabundle\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct"
Feb 23 14:34:59.570164 master-0 kubenswrapper[28758]: I0223 14:34:59.570107 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-key\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct"
Feb 23 14:34:59.570164 master-0 kubenswrapper[28758]: I0223 14:34:59.570114 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0315476e-7140-4777-8061-9cead4c92024-tmpfs\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh"
Feb 23 14:34:59.570164 master-0 kubenswrapper[28758]: I0223 14:34:59.570132 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw"
Feb 23 14:34:59.570164 master-0 kubenswrapper[28758]: I0223 14:34:59.570143 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9cf1c39-24f0-420b-8020-089616d1cdf0-config\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID: \"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570169 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/66c72c71-f74a-43ab-bf0d-1f4c93623774-catalogserver-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570197 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570226 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570246 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-cabundle\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570257 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570286 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv5nj\" (UniqueName: \"kubernetes.io/projected/4373687a-61a0-434b-81f7-3fecaa1494ef-kube-api-access-wv5nj\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570321 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570348 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570370 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cni-binary-copy\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570378 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-signing-key\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570456 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44599\" (UniqueName: \"kubernetes.io/projected/0cebb80d-d898-44c8-82b3-1e18833cee3f-kube-api-access-44599\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570525 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad0f0d72-0337-4347-bb50-e299a175f3ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570556 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570589 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-modprobe-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570635 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570682 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570705 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570731 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570760 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570765 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbe678de-546d-49d0-8280-3f6d94fa5e4f-env-overrides\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " pod="openshift-network-node-identity/network-node-identity-td489"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570804 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-var-lib-kubelet\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570840 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570876 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570906 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.570916 master-0 kubenswrapper[28758]: I0223 14:34:59.570927 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-textfile\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.570946 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-utilities\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.570976 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-daemon-config\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.571014 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x6q2\" (UniqueName: \"kubernetes.io/projected/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-kube-api-access-9x6q2\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.571060 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.571067 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-textfile\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.571081 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-docker\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.571103 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h"
Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.571103 28758 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c84f66f0-207e-436a-8f4e-d1971fa815eb-utilities\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:34:59.571781 master-0 kubenswrapper[28758]: I0223 14:34:59.571376 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08c561b3-613b-425f-9de4-d5fc8762ea51-iptables-alerter-script\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:34:59.579678 master-0 kubenswrapper[28758]: I0223 14:34:59.579644 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Feb 23 14:34:59.599336 master-0 kubenswrapper[28758]: I0223 14:34:59.599294 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 14:34:59.619793 master-0 kubenswrapper[28758]: I0223 14:34:59.619750 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Feb 23 14:34:59.620874 master-0 kubenswrapper[28758]: I0223 14:34:59.620823 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/66c72c71-f74a-43ab-bf0d-1f4c93623774-catalogserver-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.644277 master-0 kubenswrapper[28758]: I0223 14:34:59.644211 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Feb 23 14:34:59.659624 master-0 kubenswrapper[28758]: I0223 14:34:59.659428 28758 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Feb 23 14:34:59.660845 master-0 kubenswrapper[28758]: I0223 14:34:59.660808 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Feb 23 14:34:59.661980 master-0 kubenswrapper[28758]: I0223 14:34:59.661944 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-ca-certs\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.670514 master-0 kubenswrapper[28758]: I0223 14:34:59.670458 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Feb 23 14:34:59.672825 master-0 kubenswrapper[28758]: I0223 14:34:59.672785 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:34:59.672899 master-0 kubenswrapper[28758]: I0223 14:34:59.672865 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-lib-modules\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.673027 master-0 kubenswrapper[28758]: I0223 14:34:59.673000 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-lib-modules\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.673089 master-0 kubenswrapper[28758]: I0223 14:34:59.673015 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:34:59.673089 master-0 kubenswrapper[28758]: I0223 14:34:59.673033 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673187 master-0 kubenswrapper[28758]: I0223 14:34:59.673084 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-conf-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673238 master-0 kubenswrapper[28758]: I0223 14:34:59.673189 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:59.673289 master-0 kubenswrapper[28758]: I0223 14:34:59.673253 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673344 master-0 kubenswrapper[28758]: I0223 14:34:59.673298 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-k8s-cni-cncf-io\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673393 master-0 kubenswrapper[28758]: I0223 14:34:59.673344 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:34:59.673393 master-0 kubenswrapper[28758]: I0223 14:34:59.673375 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673393 master-0 kubenswrapper[28758]: I0223 14:34:59.673383 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:59.673541 master-0 kubenswrapper[28758]: I0223 14:34:59.673441 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/674041a2-e2b0-4286-88cc-f1b00571e3f3-host-etc-kube\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:34:59.673607 master-0 kubenswrapper[28758]: I0223 14:34:59.673551 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673607 master-0 kubenswrapper[28758]: I0223 14:34:59.673553 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-modprobe-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.673692 master-0 kubenswrapper[28758]: I0223 14:34:59.673609 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-var-lib-kubelet\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.673692 master-0 kubenswrapper[28758]: I0223 14:34:59.673668 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.673779 master-0 kubenswrapper[28758]: I0223 14:34:59.673695 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673779 master-0 kubenswrapper[28758]: I0223 14:34:59.673684 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-var-lib-kubelet\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.673779 master-0 kubenswrapper[28758]: I0223 14:34:59.673694 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-modprobe-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.673779 master-0 kubenswrapper[28758]: I0223 14:34:59.673735 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-bin\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.673779 master-0 kubenswrapper[28758]: I0223 14:34:59.673761 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-docker\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673801 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673818 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-docker\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673766 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-cnibin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673840 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673852 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-etc-kubernetes\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673883 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673918 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-systemd\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673940 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673964 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.673994 master-0 kubenswrapper[28758]: I0223 14:34:59.673990 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-ovn\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674028 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-node-pullsecrets\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674065 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-node-pullsecrets\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674082 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674001 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-systemd\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674112 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674128 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674152 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674151 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-etc-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674184 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674214 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 
14:34:59.674257 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08c561b3-613b-425f-9de4-d5fc8762ea51-host-slash\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674278 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-multus-certs\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674295 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674314 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-resource-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674331 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-resource-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.674384 master-0 kubenswrapper[28758]: I0223 14:34:59.674372 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-conf\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674407 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-containers\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674496 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-containers\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674571 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674590 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-conf\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:34:59.675207 master-0 
kubenswrapper[28758]: I0223 14:34:59.674608 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674614 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674637 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674647 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674674 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " 
pod="openshift-kube-apiserver/kube-apiserver-master-0"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674678 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-run\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674702 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-ssl-certs\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674884 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674901 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-run\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.674972 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-ssl-certs\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.675055 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.675095 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-systemd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.675118 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.675140 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-slash\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.675171 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.675183 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.675207 master-0 kubenswrapper[28758]: I0223 14:34:59.675179 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-kubelet\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675260 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675310 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675315 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysconfig\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675367 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysconfig\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675426 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/76c67569-3a72-4de9-87cd-432a4607b15b-rootfs\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675546 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675588 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675661 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/76c67569-3a72-4de9-87cd-432a4607b15b-rootfs\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675677 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675693 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675735 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-system-cni-dir\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675762 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-system-cni-dir\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675871 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675930 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675955 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-data-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.675970 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-netns\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.676020 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-data-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.676069 master-0 kubenswrapper[28758]: I0223 14:34:59.676063 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676119 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676157 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676189 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676235 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676252 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-multus-socket-dir-parent\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676265 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676357 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-sys\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676378 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676392 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676400 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676427 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-run-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676434 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676448 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-host\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676463 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-sys\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676468 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676509 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-log-socket\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676527 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-os-release\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676536 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-host\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676570 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676617 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676656 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676669 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-os-release\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676615 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-run-ovn-kubernetes\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676757 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676781 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-multus\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676786 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676811 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-sysctl-d\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676835 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676855 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676894 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-log-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676909 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d03a1e6620a92c780b0a91c72a55bc8b-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"d03a1e6620a92c780b0a91c72a55bc8b\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Feb 23 14:34:59.676886 master-0 kubenswrapper[28758]: I0223 14:34:59.676914 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.676935 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-log-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.676946 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-cvo-updatepayloads\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.676959 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-kubelet\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677016 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-containers\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677073 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-etc-containers\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677092 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b9774f8c-0f29-46d8-be77-81bcf74d5994-etc-cvo-updatepayloads\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677146 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-docker\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677168 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677208 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677231 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-static-pod-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677246 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/66c72c71-f74a-43ab-bf0d-1f4c93623774-etc-docker\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677276 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c997c8e9d3be51d454d8e61e376bef08-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"c997c8e9d3be51d454d8e61e376bef08\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677288 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-wtmp\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677293 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-var-lib-openvswitch\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677314 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-dir\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677346 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-static-pod-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677358 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-wtmp\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677365 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-dir\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677389 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-sys\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677453 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-sys\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677451 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-cert-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677505 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-root\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677550 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-cert-dir\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677532 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677585 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-run-netns\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677623 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677625 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/15ad7f4e-44c6-4426-8b97-c47a47786544-root\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677666 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-host-cni-netd\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677746 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-audit-dir\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677791 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/588a804a-430a-47f4-aa97-c08e907239da-audit-dir\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677871 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-usr-local-bin\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677956 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/18a83278819db2092fa26d8274eb3f00-usr-local-bin\") pod \"etcd-master-0\" (UID: \"18a83278819db2092fa26d8274eb3f00\") " pod="openshift-etcd/etcd-master-0"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.677964 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-hosts-file\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.678018 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.678055 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.678065 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-hosts-file\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.678119 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/483786a0-0a29-44bf-bbd0-2f37e045aa2c-cnibin\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6"
Feb 23 14:34:59.678149 master-0 kubenswrapper[28758]: I0223 14:34:59.678161 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-host-var-lib-cni-bin\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678261 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-kubernetes\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678295 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b558268-2262-4593-893e-408639a9987d-etc-kubernetes\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c"
Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678338 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678389 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09d80e28-0b64-4c5d-a9bc-99d843d40165-hostroot\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk"
Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678407 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678442 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName:
\"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678505 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-systemd-units\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678596 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678634 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.678731 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f10f592e-5738-4879-b776-246b357d4621-node-log\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:34:59.679712 master-0 kubenswrapper[28758]: I0223 14:34:59.679335 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 14:34:59.683753 master-0 kubenswrapper[28758]: I0223 14:34:59.683722 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-etcd-client\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.700231 master-0 kubenswrapper[28758]: I0223 14:34:59.700198 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 23 14:34:59.706635 master-0 kubenswrapper[28758]: I0223 14:34:59.706556 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-serving-cert\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.719809 master-0 kubenswrapper[28758]: I0223 14:34:59.719758 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 14:34:59.728897 master-0 kubenswrapper[28758]: I0223 14:34:59.728817 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/588a804a-430a-47f4-aa97-c08e907239da-encryption-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.739516 master-0 kubenswrapper[28758]: I0223 14:34:59.739468 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 23 14:34:59.741036 master-0 kubenswrapper[28758]: I0223 14:34:59.740989 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:34:59.741529 master-0 kubenswrapper[28758]: I0223 14:34:59.741425 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.741710 master-0 kubenswrapper[28758]: I0223 14:34:59.741635 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.741710 master-0 kubenswrapper[28758]: I0223 14:34:59.741721 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.741939 master-0 kubenswrapper[28758]: I0223 14:34:59.741747 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.742350 master-0 kubenswrapper[28758]: I0223 14:34:59.742322 28758 scope.go:117] "RemoveContainer" containerID="7b1c7cc7b2e3d4c1fdbe5b4592355d6abc03a37f10ae5ab746402745b7ae1aa2" Feb 23 14:34:59.742427 master-0 kubenswrapper[28758]: I0223 14:34:59.742374 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.742567 master-0 kubenswrapper[28758]: I0223 14:34:59.742528 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.742667 master-0 kubenswrapper[28758]: I0223 14:34:59.742573 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.742667 master-0 kubenswrapper[28758]: I0223 14:34:59.742589 28758 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.747667 master-0 kubenswrapper[28758]: I0223 14:34:59.746431 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:34:59.747667 
master-0 kubenswrapper[28758]: I0223 14:34:59.747630 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Feb 23 14:34:59.748106 master-0 kubenswrapper[28758]: I0223 14:34:59.748074 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-config\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.749530 master-0 kubenswrapper[28758]: I0223 14:34:59.749445 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:34:59.763891 master-0 kubenswrapper[28758]: I0223 14:34:59.763838 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 23 14:34:59.773967 master-0 kubenswrapper[28758]: I0223 14:34:59.773915 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-audit\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.785349 master-0 kubenswrapper[28758]: I0223 14:34:59.785292 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") pod \"e1148263-7b15-4c12-a217-8b030ecd9348\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " Feb 23 14:34:59.785695 master-0 kubenswrapper[28758]: I0223 14:34:59.785390 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") pod 
\"e1148263-7b15-4c12-a217-8b030ecd9348\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " Feb 23 14:34:59.785946 master-0 kubenswrapper[28758]: I0223 14:34:59.785869 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e1148263-7b15-4c12-a217-8b030ecd9348" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:34:59.786333 master-0 kubenswrapper[28758]: I0223 14:34:59.786055 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock" (OuterVolumeSpecName: "var-lock") pod "e1148263-7b15-4c12-a217-8b030ecd9348" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:34:59.786962 master-0 kubenswrapper[28758]: I0223 14:34:59.786896 28758 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:34:59.787033 master-0 kubenswrapper[28758]: I0223 14:34:59.786967 28758 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1148263-7b15-4c12-a217-8b030ecd9348-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:34:59.788088 master-0 kubenswrapper[28758]: I0223 14:34:59.788049 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 14:34:59.794536 master-0 kubenswrapper[28758]: I0223 14:34:59.794431 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-etcd-serving-ca\") pod 
\"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.800136 master-0 kubenswrapper[28758]: I0223 14:34:59.800097 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 23 14:34:59.809965 master-0 kubenswrapper[28758]: I0223 14:34:59.809901 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-image-import-ca\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.812258 master-0 kubenswrapper[28758]: I0223 14:34:59.812197 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.812589 master-0 kubenswrapper[28758]: I0223 14:34:59.812542 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Feb 23 14:34:59.826877 master-0 kubenswrapper[28758]: I0223 14:34:59.826817 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 14:34:59.828449 master-0 kubenswrapper[28758]: I0223 14:34:59.828407 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/588a804a-430a-47f4-aa97-c08e907239da-trusted-ca-bundle\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:34:59.840553 master-0 kubenswrapper[28758]: I0223 14:34:59.840503 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 23 14:34:59.860614 master-0 kubenswrapper[28758]: I0223 14:34:59.860554 28758 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 23 14:34:59.879578 master-0 kubenswrapper[28758]: I0223 14:34:59.879525 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 23 14:34:59.900197 master-0 kubenswrapper[28758]: I0223 14:34:59.900104 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 23 14:34:59.906585 master-0 kubenswrapper[28758]: I0223 14:34:59.906527 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-encryption-config\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.920471 master-0 kubenswrapper[28758]: I0223 14:34:59.920420 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 23 14:34:59.924225 master-0 kubenswrapper[28758]: I0223 14:34:59.924178 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-client\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.940313 master-0 kubenswrapper[28758]: I0223 14:34:59.940275 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 23 14:34:59.944660 master-0 kubenswrapper[28758]: I0223 14:34:59.944619 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea0b3538-9a7d-4995-b628-2d63f21d683c-serving-cert\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " 
pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.960296 master-0 kubenswrapper[28758]: I0223 14:34:59.960194 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 23 14:34:59.969792 master-0 kubenswrapper[28758]: I0223 14:34:59.969727 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-audit-policies\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.980856 master-0 kubenswrapper[28758]: I0223 14:34:59.980812 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 23 14:34:59.984515 master-0 kubenswrapper[28758]: I0223 14:34:59.984404 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-etcd-serving-ca\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:34:59.999878 master-0 kubenswrapper[28758]: I0223 14:34:59.999601 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 23 14:35:00.004387 master-0 kubenswrapper[28758]: I0223 14:35:00.004326 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea0b3538-9a7d-4995-b628-2d63f21d683c-trusted-ca-bundle\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:35:00.020035 master-0 kubenswrapper[28758]: I0223 14:35:00.019988 28758 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 14:35:00.040060 master-0 kubenswrapper[28758]: I0223 14:35:00.039997 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Feb 23 14:35:00.071143 master-0 kubenswrapper[28758]: I0223 14:35:00.070801 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 23 14:35:00.073762 master-0 kubenswrapper[28758]: I0223 14:35:00.073713 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06bde94a-3126-4d0f-baba-49dc5fbec61b-service-ca-bundle\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:35:00.079803 master-0 kubenswrapper[28758]: I0223 14:35:00.079750 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Feb 23 14:35:00.097009 master-0 kubenswrapper[28758]: I0223 14:35:00.096961 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Feb 23 14:35:00.099782 master-0 kubenswrapper[28758]: I0223 14:35:00.099746 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 14:35:00.110188 master-0 kubenswrapper[28758]: I0223 14:35:00.110135 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9774f8c-0f29-46d8-be77-81bcf74d5994-service-ca\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" Feb 23 14:35:00.120105 master-0 kubenswrapper[28758]: I0223 14:35:00.120069 28758 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 14:35:00.145269 master-0 kubenswrapper[28758]: I0223 14:35:00.145205 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Feb 23 14:35:00.153501 master-0 kubenswrapper[28758]: I0223 14:35:00.153441 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-ca-certs\") pod \"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:35:00.159508 master-0 kubenswrapper[28758]: I0223 14:35:00.159451 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 23 14:35:00.164130 master-0 kubenswrapper[28758]: I0223 14:35:00.164023 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-default-certificate\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:35:00.181133 master-0 kubenswrapper[28758]: I0223 14:35:00.181075 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 14:35:00.189630 master-0 kubenswrapper[28758]: I0223 14:35:00.189562 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9774f8c-0f29-46d8-be77-81bcf74d5994-serving-cert\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm" 
Feb 23 14:35:00.200648 master-0 kubenswrapper[28758]: I0223 14:35:00.200602 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 23 14:35:00.211539 master-0 kubenswrapper[28758]: I0223 14:35:00.211473 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-stats-auth\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:35:00.221201 master-0 kubenswrapper[28758]: I0223 14:35:00.221127 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Feb 23 14:35:00.229086 master-0 kubenswrapper[28758]: I0223 14:35:00.229030 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c02c8912-46c9-4f86-ad28-9bfb2eca4e54-tls-certificates\") pod \"prometheus-operator-admission-webhook-75d56db95f-rg8tp\" (UID: \"c02c8912-46c9-4f86-ad28-9bfb2eca4e54\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp" Feb 23 14:35:00.239707 master-0 kubenswrapper[28758]: I0223 14:35:00.239675 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 23 14:35:00.248489 master-0 kubenswrapper[28758]: I0223 14:35:00.248425 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06bde94a-3126-4d0f-baba-49dc5fbec61b-metrics-certs\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:35:00.260783 master-0 kubenswrapper[28758]: I0223 14:35:00.260718 28758 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 14:35:00.280043 master-0 kubenswrapper[28758]: I0223 14:35:00.279974 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 23 14:35:00.281332 master-0 kubenswrapper[28758]: I0223 14:35:00.281277 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad0f0d72-0337-4347-bb50-e299a175f3ca-image-registry-operator-tls\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:35:00.307995 master-0 kubenswrapper[28758]: I0223 14:35:00.307419 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 23 14:35:00.308175 master-0 kubenswrapper[28758]: I0223 14:35:00.308030 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad0f0d72-0337-4347-bb50-e299a175f3ca-trusted-ca\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:35:00.319897 master-0 kubenswrapper[28758]: I0223 14:35:00.319848 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 14:35:00.340159 master-0 kubenswrapper[28758]: I0223 14:35:00.340108 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Feb 23 14:35:00.360689 master-0 kubenswrapper[28758]: I0223 14:35:00.360612 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-hrd9b" Feb 
23 14:35:00.384778 master-0 kubenswrapper[28758]: I0223 14:35:00.384722 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Feb 23 14:35:00.395232 master-0 kubenswrapper[28758]: I0223 14:35:00.395186 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85365dec-af50-406c-b258-890e4f454c4a-cco-trusted-ca\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" Feb 23 14:35:00.399780 master-0 kubenswrapper[28758]: I0223 14:35:00.399750 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Feb 23 14:35:00.420830 master-0 kubenswrapper[28758]: I0223 14:35:00.420693 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Feb 23 14:35:00.430514 master-0 kubenswrapper[28758]: I0223 14:35:00.430432 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85365dec-af50-406c-b258-890e4f454c4a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" Feb 23 14:35:00.438964 master-0 kubenswrapper[28758]: I0223 14:35:00.438923 28758 request.go:700] Waited for 1.020066331s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Feb 23 14:35:00.440171 master-0 kubenswrapper[28758]: 
I0223 14:35:00.440131 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 23 14:35:00.440703 master-0 kubenswrapper[28758]: I0223 14:35:00.440659 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4373687a-61a0-434b-81f7-3fecaa1494ef-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:35:00.459777 master-0 kubenswrapper[28758]: I0223 14:35:00.459722 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 23 14:35:00.479404 master-0 kubenswrapper[28758]: I0223 14:35:00.479324 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 23 14:35:00.499853 master-0 kubenswrapper[28758]: I0223 14:35:00.499803 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 23 14:35:00.520234 master-0 kubenswrapper[28758]: I0223 14:35:00.520185 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 23 14:35:00.525729 master-0 kubenswrapper[28758]: I0223 14:35:00.525692 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-samples-operator-tls\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5"
Feb 23 14:35:00.540285 master-0 kubenswrapper[28758]: I0223 14:35:00.540223 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 23 14:35:00.557115 master-0 kubenswrapper[28758]: E0223 14:35:00.557019 28758 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-car2df00nf4i0: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557115 master-0 kubenswrapper[28758]: E0223 14:35:00.557068 28758 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557115 master-0 kubenswrapper[28758]: E0223 14:35:00.557075 28758 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557179 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle podName:9416f5d0-32b4-4065-b678-26913af8b6dd nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057152384 +0000 UTC m=+33.183468316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle") pod "metrics-server-f55d8f669-b2gf9" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557203 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert podName:255b5a89-1b89-42dc-868a-32ce67975a54 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057193425 +0000 UTC m=+33.183509357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert") pod "catalog-operator-596f79dd6f-mhzxn" (UID: "255b5a89-1b89-42dc-868a-32ce67975a54") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557227 28758 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557239 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images podName:ceba7b56-f910-473d-aed5-add94868fb31 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057215375 +0000 UTC m=+33.183531327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images") pod "machine-api-operator-5c7cf458b4-bb7zl" (UID: "ceba7b56-f910-473d-aed5-add94868fb31") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557238 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557266 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls podName:5cc28e06-3542-4a25-a8b1-5f5b4ee41114 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057256496 +0000 UTC m=+33.183572518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls") pod "machine-config-controller-54cb48566c-g4r57" (UID: "5cc28e06-3542-4a25-a8b1-5f5b4ee41114") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557286 28758 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557300 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles podName:9416f5d0-32b4-4065-b678-26913af8b6dd nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057289017 +0000 UTC m=+33.183605049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles") pod "metrics-server-f55d8f669-b2gf9" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557313 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs podName:c0a39496-5e47-4415-b8bf-ed0634797ce1 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057307808 +0000 UTC m=+33.183623740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs") pod "machine-config-server-qwsmk" (UID: "c0a39496-5e47-4415-b8bf-ed0634797ce1") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.557499 master-0 kubenswrapper[28758]: E0223 14:35:00.557465 28758 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.558007 master-0 kubenswrapper[28758]: E0223 14:35:00.557555 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls podName:9416f5d0-32b4-4065-b678-26913af8b6dd nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057537414 +0000 UTC m=+33.183853366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls") pod "metrics-server-f55d8f669-b2gf9" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.558007 master-0 kubenswrapper[28758]: E0223 14:35:00.557617 28758 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.558007 master-0 kubenswrapper[28758]: E0223 14:35:00.557673 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls podName:ceba7b56-f910-473d-aed5-add94868fb31 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.057655337 +0000 UTC m=+33.183971349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls") pod "machine-api-operator-5c7cf458b4-bb7zl" (UID: "ceba7b56-f910-473d-aed5-add94868fb31") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.558424 master-0 kubenswrapper[28758]: E0223 14:35:00.558282 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.558424 master-0 kubenswrapper[28758]: E0223 14:35:00.558305 28758 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.558424 master-0 kubenswrapper[28758]: E0223 14:35:00.558357 28758 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.558424 master-0 kubenswrapper[28758]: E0223 14:35:00.558328 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca podName:15ad7f4e-44c6-4426-8b97-c47a47786544 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.058317045 +0000 UTC m=+33.184633027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca") pod "node-exporter-ckhv6" (UID: "15ad7f4e-44c6-4426-8b97-c47a47786544") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.558424 master-0 kubenswrapper[28758]: E0223 14:35:00.558416 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config podName:15ad7f4e-44c6-4426-8b97-c47a47786544 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.058383917 +0000 UTC m=+33.184699849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config") pod "node-exporter-ckhv6" (UID: "15ad7f4e-44c6-4426-8b97-c47a47786544") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.558714 master-0 kubenswrapper[28758]: E0223 14:35:00.558438 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert podName:0cebb80d-d898-44c8-82b3-1e18833cee3f nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.058427878 +0000 UTC m=+33.184743900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert") pod "olm-operator-5499d7f7bb-t45zz" (UID: "0cebb80d-d898-44c8-82b3-1e18833cee3f") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.559549 master-0 kubenswrapper[28758]: E0223 14:35:00.559508 28758 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.559677 master-0 kubenswrapper[28758]: E0223 14:35:00.559589 28758 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.559677 master-0 kubenswrapper[28758]: E0223 14:35:00.559593 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert podName:87f989cd-6c19-4a30-833a-10e98b7a0326 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.059568849 +0000 UTC m=+33.185884811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert") pod "ingress-canary-nwdpd" (UID: "87f989cd-6c19-4a30-833a-10e98b7a0326") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.559677 master-0 kubenswrapper[28758]: E0223 14:35:00.559662 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config podName:c67a2ed2-f520-46fc-84d3-6816dc19f4e0 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.059650781 +0000 UTC m=+33.185966703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config") pod "machine-approver-7dd9c7d7b9-rn8fj" (UID: "c67a2ed2-f520-46fc-84d3-6816dc19f4e0") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.560139 master-0 kubenswrapper[28758]: E0223 14:35:00.560103 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.560139 master-0 kubenswrapper[28758]: E0223 14:35:00.560109 28758 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.560270 master-0 kubenswrapper[28758]: I0223 14:35:00.560154 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Feb 23 14:35:00.560313 master-0 kubenswrapper[28758]: E0223 14:35:00.560271 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images podName:172d47fd-e1a1-4d77-9e31-c4f22e824d5f nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.060181035 +0000 UTC m=+33.186497007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images") pod "cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" (UID: "172d47fd-e1a1-4d77-9e31-c4f22e824d5f") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.560361 master-0 kubenswrapper[28758]: E0223 14:35:00.560308 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca podName:fae9a4cf-2acf-4728-9105-87e004052fe5 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.060292988 +0000 UTC m=+33.186608950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca") pod "openshift-state-metrics-6dbff8cb4c-9qg7j" (UID: "fae9a4cf-2acf-4728-9105-87e004052fe5") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.561006 master-0 kubenswrapper[28758]: E0223 14:35:00.560978 28758 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.561006 master-0 kubenswrapper[28758]: E0223 14:35:00.560999 28758 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.561112 master-0 kubenswrapper[28758]: E0223 14:35:00.561031 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config podName:f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.061011507 +0000 UTC m=+33.187327439 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-59584d565f-pdl4r" (UID: "f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.561112 master-0 kubenswrapper[28758]: E0223 14:35:00.561056 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls podName:c67a2ed2-f520-46fc-84d3-6816dc19f4e0 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.061045098 +0000 UTC m=+33.187361030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls") pod "machine-approver-7dd9c7d7b9-rn8fj" (UID: "c67a2ed2-f520-46fc-84d3-6816dc19f4e0") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.561305 master-0 kubenswrapper[28758]: I0223 14:35:00.561264 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-images\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"
Feb 23 14:35:00.561401 master-0 kubenswrapper[28758]: E0223 14:35:00.561371 28758 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.561401 master-0 kubenswrapper[28758]: E0223 14:35:00.561384 28758 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.561525 master-0 kubenswrapper[28758]: E0223 14:35:00.561410 28758 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.561525 master-0 kubenswrapper[28758]: E0223 14:35:00.561420 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config podName:12b256b7-a57b-4124-8452-25e74cfa7926 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.061409158 +0000 UTC m=+33.187725170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config") pod "cluster-baremetal-operator-d6bb9bb76-4frj6" (UID: "12b256b7-a57b-4124-8452-25e74cfa7926") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.561525 master-0 kubenswrapper[28758]: E0223 14:35:00.561438 28758 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.561525 master-0 kubenswrapper[28758]: E0223 14:35:00.561444 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls podName:af950a67-1557-4352-8100-27281bb8ecbe nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.061433049 +0000 UTC m=+33.187749091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls") pod "machine-config-operator-7f8c75f984-rdjxr" (UID: "af950a67-1557-4352-8100-27281bb8ecbe") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.561525 master-0 kubenswrapper[28758]: E0223 14:35:00.561506 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config podName:959c2393-e914-4c10-a18f-b30fcf012d19 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.06149413 +0000 UTC m=+33.187810122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config") pod "controller-manager-55d786cb4c-cqkbt" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.561525 master-0 kubenswrapper[28758]: E0223 14:35:00.561521 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls podName:172d47fd-e1a1-4d77-9e31-c4f22e824d5f nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.061513581 +0000 UTC m=+33.187829633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" (UID: "172d47fd-e1a1-4d77-9e31-c4f22e824d5f") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.562280 master-0 kubenswrapper[28758]: E0223 14:35:00.562250 28758 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.562332 master-0 kubenswrapper[28758]: E0223 14:35:00.562316 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca podName:959c2393-e914-4c10-a18f-b30fcf012d19 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.062299262 +0000 UTC m=+33.188615194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca") pod "controller-manager-55d786cb4c-cqkbt" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.562381 master-0 kubenswrapper[28758]: E0223 14:35:00.562334 28758 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.562424 master-0 kubenswrapper[28758]: E0223 14:35:00.562408 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls podName:76c67569-3a72-4de9-87cd-432a4607b15b nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.062365184 +0000 UTC m=+33.188681136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls") pod "machine-config-daemon-fhcgg" (UID: "76c67569-3a72-4de9-87cd-432a4607b15b") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563352 master-0 kubenswrapper[28758]: E0223 14:35:00.563323 28758 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563407 master-0 kubenswrapper[28758]: E0223 14:35:00.563365 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert podName:482284fd-6911-4ba6-8d57-7966cc51117a nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.063355071 +0000 UTC m=+33.189671003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert") pod "route-controller-manager-8bb99f4f-msq8f" (UID: "482284fd-6911-4ba6-8d57-7966cc51117a") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563407 master-0 kubenswrapper[28758]: E0223 14:35:00.563379 28758 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.563407 master-0 kubenswrapper[28758]: E0223 14:35:00.563396 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563601 master-0 kubenswrapper[28758]: E0223 14:35:00.563421 28758 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563601 master-0 kubenswrapper[28758]: E0223 14:35:00.563431 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config podName:172d47fd-e1a1-4d77-9e31-c4f22e824d5f nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.063419362 +0000 UTC m=+33.189735304 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" (UID: "172d47fd-e1a1-4d77-9e31-c4f22e824d5f") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.563601 master-0 kubenswrapper[28758]: E0223 14:35:00.563500 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls podName:15ad7f4e-44c6-4426-8b97-c47a47786544 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.063448263 +0000 UTC m=+33.189764285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls") pod "node-exporter-ckhv6" (UID: "15ad7f4e-44c6-4426-8b97-c47a47786544") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563601 master-0 kubenswrapper[28758]: E0223 14:35:00.563540 28758 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.563601 master-0 kubenswrapper[28758]: E0223 14:35:00.563561 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls podName:18da400b-2271-455d-be0d-0ed44c74f78d nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.063547616 +0000 UTC m=+33.189863568 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls") pod "prometheus-operator-754bc4d665-nl92v" (UID: "18da400b-2271-455d-be0d-0ed44c74f78d") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563601 master-0 kubenswrapper[28758]: E0223 14:35:00.563584 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config podName:5cc28e06-3542-4a25-a8b1-5f5b4ee41114 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.063574927 +0000 UTC m=+33.189890869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config") pod "machine-config-controller-54cb48566c-g4r57" (UID: "5cc28e06-3542-4a25-a8b1-5f5b4ee41114") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.563601 master-0 kubenswrapper[28758]: E0223 14:35:00.563585 28758 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.563824 master-0 kubenswrapper[28758]: E0223 14:35:00.563620 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls podName:fae9a4cf-2acf-4728-9105-87e004052fe5 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.063611198 +0000 UTC m=+33.189927130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls") pod "openshift-state-metrics-6dbff8cb4c-9qg7j" (UID: "fae9a4cf-2acf-4728-9105-87e004052fe5") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564649 master-0 kubenswrapper[28758]: E0223 14:35:00.564577 28758 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564649 master-0 kubenswrapper[28758]: E0223 14:35:00.564577 28758 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564649 master-0 kubenswrapper[28758]: E0223 14:35:00.564610 28758 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.564649 master-0 kubenswrapper[28758]: E0223 14:35:00.564623 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token podName:c0a39496-5e47-4415-b8bf-ed0634797ce1 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.064613335 +0000 UTC m=+33.190929267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token") pod "machine-config-server-qwsmk" (UID: "c0a39496-5e47-4415-b8bf-ed0634797ce1") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564649 master-0 kubenswrapper[28758]: E0223 14:35:00.564650 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config podName:3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.064640785 +0000 UTC m=+33.190956717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config") pod "cluster-autoscaler-operator-86b8dc6d6-2kvfp" (UID: "3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.564855 master-0 kubenswrapper[28758]: E0223 14:35:00.564663 28758 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564855 master-0 kubenswrapper[28758]: E0223 14:35:00.564674 28758 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564855 master-0 kubenswrapper[28758]: E0223 14:35:00.564681 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert podName:0315476e-7140-4777-8061-9cead4c92024 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.064659356 +0000 UTC m=+33.190975318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert") pod "packageserver-65c9585877-m66zh" (UID: "0315476e-7140-4777-8061-9cead4c92024") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564855 master-0 kubenswrapper[28758]: E0223 14:35:00.564705 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert podName:255b5a89-1b89-42dc-868a-32ce67975a54 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.064696917 +0000 UTC m=+33.191012939 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert") pod "catalog-operator-596f79dd6f-mhzxn" (UID: "255b5a89-1b89-42dc-868a-32ce67975a54") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.564855 master-0 kubenswrapper[28758]: E0223 14:35:00.564724 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls podName:12b256b7-a57b-4124-8452-25e74cfa7926 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.064716617 +0000 UTC m=+33.191032659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-d6bb9bb76-4frj6" (UID: "12b256b7-a57b-4124-8452-25e74cfa7926") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.565935 master-0 kubenswrapper[28758]: E0223 14:35:00.565839 28758 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.565935 master-0 kubenswrapper[28758]: E0223 14:35:00.565847 28758 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.565935 master-0 kubenswrapper[28758]: E0223 14:35:00.565857 28758 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.565935 master-0 kubenswrapper[28758]: E0223 14:35:00.565889 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert podName:ae4baa4e-4ef4-433d-aa36-149e92fa6ee2 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.065878089 +0000 UTC m=+33.192194101 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert") pod "insights-operator-59b498fcfb-rz897" (UID: "ae4baa4e-4ef4-433d-aa36-149e92fa6ee2") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.565935 master-0 kubenswrapper[28758]: E0223 14:35:00.565907 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images podName:af950a67-1557-4352-8100-27281bb8ecbe nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.065900379 +0000 UTC m=+33.192216401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images") pod "machine-config-operator-7f8c75f984-rdjxr" (UID: "af950a67-1557-4352-8100-27281bb8ecbe") : failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.565935 master-0 kubenswrapper[28758]: E0223 14:35:00.565923 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert podName:0315476e-7140-4777-8061-9cead4c92024 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.06591649 +0000 UTC m=+33.192232522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert") pod "packageserver-65c9585877-m66zh" (UID: "0315476e-7140-4777-8061-9cead4c92024") : failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.567098 master-0 kubenswrapper[28758]: E0223 14:35:00.566991 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.567098 master-0 kubenswrapper[28758]: E0223 14:35:00.567016 28758 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition
Feb 23 14:35:00.567098 master-0 kubenswrapper[28758]: E0223 14:35:00.567026 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.567098 master-0 kubenswrapper[28758]: E0223 14:35:00.566991 28758 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.567098 master-0 kubenswrapper[28758]: E0223 14:35:00.567054 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition
Feb 23 14:35:00.567098 master-0 kubenswrapper[28758]: E0223 14:35:00.567039 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config podName:18da400b-2271-455d-be0d-0ed44c74f78d nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.067022469 +0000 UTC m=+33.193338401 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-754bc4d665-nl92v" (UID: "18da400b-2271-455d-be0d-0ed44c74f78d") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.567098 master-0 kubenswrapper[28758]: E0223 14:35:00.567087 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs podName:8dd5fa7c-0519-4170-89c6-b369e5fc1990 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.067079021 +0000 UTC m=+33.193395043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs") pod "multus-admission-controller-5f54bf67d4-2p4jz" (UID: "8dd5fa7c-0519-4170-89c6-b369e5fc1990") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.567336 master-0 kubenswrapper[28758]: E0223 14:35:00.567102 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca podName:482284fd-6911-4ba6-8d57-7966cc51117a nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.067096411 +0000 UTC m=+33.193412343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca") pod "route-controller-manager-8bb99f4f-msq8f" (UID: "482284fd-6911-4ba6-8d57-7966cc51117a") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.567336 master-0 kubenswrapper[28758]: E0223 14:35:00.567122 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca podName:18da400b-2271-455d-be0d-0ed44c74f78d nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.067114712 +0000 UTC m=+33.193430714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca") pod "prometheus-operator-754bc4d665-nl92v" (UID: "18da400b-2271-455d-be0d-0ed44c74f78d") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.567336 master-0 kubenswrapper[28758]: E0223 14:35:00.567136 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap podName:f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.067129732 +0000 UTC m=+33.193445664 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-59584d565f-pdl4r" (UID: "f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.568256 master-0 kubenswrapper[28758]: E0223 14:35:00.568220 28758 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.568333 master-0 kubenswrapper[28758]: E0223 14:35:00.568257 28758 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.568333 master-0 kubenswrapper[28758]: E0223 14:35:00.568281 28758 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.568333 master-0 kubenswrapper[28758]: E0223 14:35:00.568283 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert podName:fbb66172-1ea9-4683-b88f-227c4fd94924 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.068267173 +0000 UTC m=+33.194583115 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-f94476f49-s6c8v" (UID: "fbb66172-1ea9-4683-b88f-227c4fd94924") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.568436 master-0 kubenswrapper[28758]: E0223 14:35:00.568336 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config podName:ceba7b56-f910-473d-aed5-add94868fb31 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.068324354 +0000 UTC m=+33.194640296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config") pod "machine-api-operator-5c7cf458b4-bb7zl" (UID: "ceba7b56-f910-473d-aed5-add94868fb31") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.568436 master-0 kubenswrapper[28758]: E0223 14:35:00.568352 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert podName:3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.068344825 +0000 UTC m=+33.194660767 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert") pod "cluster-autoscaler-operator-86b8dc6d6-2kvfp" (UID: "3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.568644 master-0 kubenswrapper[28758]: E0223 14:35:00.568576 28758 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.568644 master-0 kubenswrapper[28758]: E0223 14:35:00.568626 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert podName:12b256b7-a57b-4124-8452-25e74cfa7926 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.068611222 +0000 UTC m=+33.194927224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert") pod "cluster-baremetal-operator-d6bb9bb76-4frj6" (UID: "12b256b7-a57b-4124-8452-25e74cfa7926") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.569353 master-0 kubenswrapper[28758]: E0223 14:35:00.569329 28758 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.569420 master-0 kubenswrapper[28758]: E0223 14:35:00.569373 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config podName:482284fd-6911-4ba6-8d57-7966cc51117a nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.069362902 +0000 UTC m=+33.195678874 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config") pod "route-controller-manager-8bb99f4f-msq8f" (UID: "482284fd-6911-4ba6-8d57-7966cc51117a") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.569420 master-0 kubenswrapper[28758]: E0223 14:35:00.569405 28758 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.569543 master-0 kubenswrapper[28758]: E0223 14:35:00.569434 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle podName:ae4baa4e-4ef4-433d-aa36-149e92fa6ee2 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.069425284 +0000 UTC m=+33.195741316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle") pod "insights-operator-59b498fcfb-rz897" (UID: "ae4baa4e-4ef4-433d-aa36-149e92fa6ee2") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.569543 master-0 kubenswrapper[28758]: E0223 14:35:00.569470 28758 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.569543 master-0 kubenswrapper[28758]: E0223 14:35:00.569520 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config podName:fae9a4cf-2acf-4728-9105-87e004052fe5 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.069510456 +0000 UTC m=+33.195826488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-6dbff8cb4c-9qg7j" (UID: "fae9a4cf-2acf-4728-9105-87e004052fe5") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.569803 master-0 kubenswrapper[28758]: E0223 14:35:00.569709 28758 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.569803 master-0 kubenswrapper[28758]: E0223 14:35:00.569745 28758 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.569803 master-0 kubenswrapper[28758]: E0223 14:35:00.569753 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config podName:c67a2ed2-f520-46fc-84d3-6816dc19f4e0 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.069742503 +0000 UTC m=+33.196058505 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config") pod "machine-approver-7dd9c7d7b9-rn8fj" (UID: "c67a2ed2-f520-46fc-84d3-6816dc19f4e0") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.569803 master-0 kubenswrapper[28758]: E0223 14:35:00.569711 28758 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.569803 master-0 kubenswrapper[28758]: E0223 14:35:00.569797 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config podName:76c67569-3a72-4de9-87cd-432a4607b15b nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.069772954 +0000 UTC m=+33.196088956 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config") pod "machine-config-daemon-fhcgg" (UID: "76c67569-3a72-4de9-87cd-432a4607b15b") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.570000 master-0 kubenswrapper[28758]: E0223 14:35:00.569814 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert podName:0cebb80d-d898-44c8-82b3-1e18833cee3f nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.069806254 +0000 UTC m=+33.196122276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert") pod "olm-operator-5499d7f7bb-t45zz" (UID: "0cebb80d-d898-44c8-82b3-1e18833cee3f") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.570944 28758 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.570952 28758 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.570987 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle podName:ae4baa4e-4ef4-433d-aa36-149e92fa6ee2 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.070977156 +0000 UTC m=+33.197293158 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle") pod "insights-operator-59b498fcfb-rz897" (UID: "ae4baa4e-4ef4-433d-aa36-149e92fa6ee2") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571006 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs podName:9416f5d0-32b4-4065-b678-26913af8b6dd nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.070998367 +0000 UTC m=+33.197314399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs") pod "metrics-server-f55d8f669-b2gf9" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571016 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571032 28758 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571048 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca podName:f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.071037718 +0000 UTC m=+33.197353710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca") pod "kube-state-metrics-59584d565f-pdl4r" (UID: "f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571065 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert podName:959c2393-e914-4c10-a18f-b30fcf012d19 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.071056058 +0000 UTC m=+33.197372070 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert") pod "controller-manager-55d786cb4c-cqkbt" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571086 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571091 28758 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571116 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle podName:9416f5d0-32b4-4065-b678-26913af8b6dd nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.071107549 +0000 UTC m=+33.197423551 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle") pod "metrics-server-f55d8f669-b2gf9" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571150 master-0 kubenswrapper[28758]: E0223 14:35:00.571132 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config podName:af950a67-1557-4352-8100-27281bb8ecbe nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.07112525 +0000 UTC m=+33.197441262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config") pod "machine-config-operator-7f8c75f984-rdjxr" (UID: "af950a67-1557-4352-8100-27281bb8ecbe") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571731 master-0 kubenswrapper[28758]: E0223 14:35:00.571223 28758 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571731 master-0 kubenswrapper[28758]: E0223 14:35:00.571259 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles podName:959c2393-e914-4c10-a18f-b30fcf012d19 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.071249083 +0000 UTC m=+33.197565015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles") pod "controller-manager-55d786cb4c-cqkbt" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:00.571731 master-0 kubenswrapper[28758]: E0223 14:35:00.571279 28758 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571731 master-0 kubenswrapper[28758]: E0223 14:35:00.571307 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls podName:f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.071299325 +0000 UTC m=+33.197615337 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls") pod "kube-state-metrics-59584d565f-pdl4r" (UID: "f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571731 master-0 kubenswrapper[28758]: E0223 14:35:00.571322 28758 secret.go:189] Couldn't get secret openshift-config-operator/config-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.571731 master-0 kubenswrapper[28758]: E0223 14:35:00.571538 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert podName:92c63c95-e880-4f51-9858-7715343f7bd8 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:01.071524891 +0000 UTC m=+33.197840823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert") pod "openshift-config-operator-6f47d587d6-55qjr" (UID: "92c63c95-e880-4f51-9858-7715343f7bd8") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:00.579591 master-0 kubenswrapper[28758]: I0223 14:35:00.579420 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-vvbrf" Feb 23 14:35:00.599592 master-0 kubenswrapper[28758]: I0223 14:35:00.599550 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Feb 23 14:35:00.620310 master-0 kubenswrapper[28758]: I0223 14:35:00.620246 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Feb 23 14:35:00.639392 master-0 kubenswrapper[28758]: I0223 14:35:00.639342 28758 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Feb 23 14:35:00.659718 master-0 kubenswrapper[28758]: I0223 14:35:00.659672 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Feb 23 14:35:00.677336 master-0 kubenswrapper[28758]: I0223 14:35:00.677209 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_959c75833224b4ba3fa488b77d8f5032/kube-apiserver-check-endpoints/0.log" Feb 23 14:35:00.679304 master-0 kubenswrapper[28758]: I0223 14:35:00.679281 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Feb 23 14:35:00.680446 master-0 kubenswrapper[28758]: I0223 14:35:00.680429 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 14:35:00.700408 master-0 kubenswrapper[28758]: I0223 14:35:00.700354 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-r2snt" Feb 23 14:35:00.720321 master-0 kubenswrapper[28758]: I0223 14:35:00.719606 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 14:35:00.740096 master-0 kubenswrapper[28758]: I0223 14:35:00.739824 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-dnwls" Feb 23 14:35:00.760641 master-0 kubenswrapper[28758]: I0223 14:35:00.760587 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Feb 23 14:35:00.780399 master-0 kubenswrapper[28758]: I0223 14:35:00.780286 28758 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 14:35:00.800705 master-0 kubenswrapper[28758]: I0223 14:35:00.800531 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Feb 23 14:35:00.819569 master-0 kubenswrapper[28758]: I0223 14:35:00.819502 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 14:35:00.839894 master-0 kubenswrapper[28758]: I0223 14:35:00.839825 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-zc2l5" Feb 23 14:35:00.859685 master-0 kubenswrapper[28758]: I0223 14:35:00.859630 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 23 14:35:00.879284 master-0 kubenswrapper[28758]: I0223 14:35:00.879179 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 23 14:35:00.900182 master-0 kubenswrapper[28758]: I0223 14:35:00.900120 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Feb 23 14:35:00.940138 master-0 kubenswrapper[28758]: I0223 14:35:00.940074 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 23 14:35:00.960079 master-0 kubenswrapper[28758]: I0223 14:35:00.960034 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Feb 23 14:35:00.980430 master-0 kubenswrapper[28758]: I0223 14:35:00.980382 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 23 14:35:01.000011 master-0 kubenswrapper[28758]: I0223 14:35:00.999948 28758 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-zmwkf" Feb 23 14:35:01.019871 master-0 kubenswrapper[28758]: I0223 14:35:01.019813 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Feb 23 14:35:01.040096 master-0 kubenswrapper[28758]: I0223 14:35:01.040044 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-cpnmg" Feb 23 14:35:01.059828 master-0 kubenswrapper[28758]: I0223 14:35:01.059765 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 14:35:01.079689 master-0 kubenswrapper[28758]: I0223 14:35:01.079616 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 14:35:01.099603 master-0 kubenswrapper[28758]: I0223 14:35:01.099547 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 14:35:01.121319 master-0 kubenswrapper[28758]: I0223 14:35:01.121256 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:35:01.121319 master-0 kubenswrapper[28758]: I0223 14:35:01.121303 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: 
\"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:01.121593 master-0 kubenswrapper[28758]: I0223 14:35:01.121432 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:35:01.121593 master-0 kubenswrapper[28758]: I0223 14:35:01.121514 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:35:01.121715 master-0 kubenswrapper[28758]: I0223 14:35:01.121652 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.121715 master-0 kubenswrapper[28758]: I0223 14:35:01.121705 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.121783 master-0 kubenswrapper[28758]: I0223 14:35:01.121745 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:35:01.121783 master-0 kubenswrapper[28758]: I0223 14:35:01.121766 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/76c67569-3a72-4de9-87cd-432a4607b15b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:35:01.121783 master-0 kubenswrapper[28758]: I0223 14:35:01.121778 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:35:01.121967 master-0 kubenswrapper[28758]: I0223 14:35:01.121932 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.122010 master-0 kubenswrapper[28758]: I0223 14:35:01.121972 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: 
\"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:01.122010 master-0 kubenswrapper[28758]: I0223 14:35:01.122002 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:01.122092 master-0 kubenswrapper[28758]: I0223 14:35:01.122031 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.122177 master-0 kubenswrapper[28758]: I0223 14:35:01.122145 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-auth-proxy-config\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:35:01.122263 master-0 kubenswrapper[28758]: I0223 14:35:01.122246 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:35:01.122315 master-0 kubenswrapper[28758]: I0223 14:35:01.122302 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:35:01.122364 master-0 kubenswrapper[28758]: I0223 14:35:01.122333 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:35:01.122431 master-0 kubenswrapper[28758]: I0223 14:35:01.122397 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:01.122513 master-0 kubenswrapper[28758]: I0223 14:35:01.122425 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92c63c95-e880-4f51-9858-7715343f7bd8-serving-cert\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr" Feb 23 14:35:01.122513 master-0 kubenswrapper[28758]: I0223 14:35:01.122468 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " 
pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:35:01.122592 master-0 kubenswrapper[28758]: I0223 14:35:01.122551 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-srv-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:35:01.122592 master-0 kubenswrapper[28758]: I0223 14:35:01.122550 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:35:01.122592 master-0 kubenswrapper[28758]: I0223 14:35:01.122584 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:01.122703 master-0 kubenswrapper[28758]: I0223 14:35:01.122605 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:01.122703 master-0 kubenswrapper[28758]: I0223 14:35:01.122680 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:35:01.122782 master-0 kubenswrapper[28758]: I0223 14:35:01.122724 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:35:01.122824 master-0 kubenswrapper[28758]: I0223 14:35:01.122809 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:01.122859 master-0 kubenswrapper[28758]: I0223 14:35:01.122828 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:01.122905 master-0 kubenswrapper[28758]: I0223 14:35:01.122885 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:35:01.122940 master-0 kubenswrapper[28758]: 
I0223 14:35:01.122912 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:35:01.122969 master-0 kubenswrapper[28758]: I0223 14:35:01.122939 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:01.122969 master-0 kubenswrapper[28758]: I0223 14:35:01.122960 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-srv-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:35:01.123030 master-0 kubenswrapper[28758]: I0223 14:35:01.122970 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:35:01.123084 master-0 kubenswrapper[28758]: I0223 14:35:01.123034 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: 
\"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:35:01.123120 master-0 kubenswrapper[28758]: I0223 14:35:01.123092 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.123120 master-0 kubenswrapper[28758]: I0223 14:35:01.123115 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:35:01.123210 master-0 kubenswrapper[28758]: I0223 14:35:01.123134 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:35:01.123264 master-0 kubenswrapper[28758]: I0223 14:35:01.123238 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.123299 master-0 kubenswrapper[28758]: I0223 14:35:01.123271 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:35:01.123331 master-0 kubenswrapper[28758]: I0223 14:35:01.123296 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.123331 master-0 kubenswrapper[28758]: I0223 14:35:01.123299 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b256b7-a57b-4124-8452-25e74cfa7926-config\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:35:01.123405 master-0 kubenswrapper[28758]: I0223 14:35:01.123359 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:35:01.123452 master-0 kubenswrapper[28758]: I0223 14:35:01.123417 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") pod 
\"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:35:01.123528 master-0 kubenswrapper[28758]: I0223 14:35:01.123449 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:01.123528 master-0 kubenswrapper[28758]: I0223 14:35:01.123501 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:35:01.123614 master-0 kubenswrapper[28758]: I0223 14:35:01.123562 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:01.123614 master-0 kubenswrapper[28758]: I0223 14:35:01.123590 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 
14:35:01.123701 master-0 kubenswrapper[28758]: I0223 14:35:01.123630 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:35:01.123701 master-0 kubenswrapper[28758]: I0223 14:35:01.123681 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:35:01.123772 master-0 kubenswrapper[28758]: I0223 14:35:01.123750 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:35:01.123816 master-0 kubenswrapper[28758]: I0223 14:35:01.123806 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:35:01.123861 master-0 kubenswrapper[28758]: I0223 14:35:01.123834 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:35:01.123861 master-0 kubenswrapper[28758]: I0223 14:35:01.123837 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-mcc-auth-proxy-config\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:35:01.123935 master-0 kubenswrapper[28758]: I0223 14:35:01.123864 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:35:01.124085 master-0 kubenswrapper[28758]: I0223 14:35:01.124056 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/255b5a89-1b89-42dc-868a-32ce67975a54-profile-collector-cert\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn" Feb 23 14:35:01.124085 master-0 kubenswrapper[28758]: I0223 14:35:01.124061 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-auth-proxy-config\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: 
\"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:35:01.124085 master-0 kubenswrapper[28758]: I0223 14:35:01.124056 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:35:01.124215 master-0 kubenswrapper[28758]: I0223 14:35:01.124110 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:35:01.124215 master-0 kubenswrapper[28758]: I0223 14:35:01.124125 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:35:01.124215 master-0 kubenswrapper[28758]: I0223 14:35:01.124150 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:35:01.124215 master-0 kubenswrapper[28758]: I0223 14:35:01.124184 28758 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:35:01.124358 master-0 kubenswrapper[28758]: I0223 14:35:01.124246 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:35:01.124358 master-0 kubenswrapper[28758]: I0223 14:35:01.124247 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-serving-cert\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:35:01.124358 master-0 kubenswrapper[28758]: I0223 14:35:01.124290 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:35:01.124358 master-0 kubenswrapper[28758]: I0223 14:35:01.124313 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/af950a67-1557-4352-8100-27281bb8ecbe-images\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " 
pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:35:01.124358 master-0 kubenswrapper[28758]: I0223 14:35:01.124326 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.124560 master-0 kubenswrapper[28758]: I0223 14:35:01.124403 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:35:01.124560 master-0 kubenswrapper[28758]: I0223 14:35:01.124444 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:35:01.124560 master-0 kubenswrapper[28758]: I0223 14:35:01.124538 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:35:01.124646 
master-0 kubenswrapper[28758]: I0223 14:35:01.124595 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:35:01.124646 master-0 kubenswrapper[28758]: I0223 14:35:01.124632 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert\") pod \"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:35:01.124706 master-0 kubenswrapper[28758]: I0223 14:35:01.124679 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:35:01.124739 master-0 kubenswrapper[28758]: I0223 14:35:01.124690 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbb66172-1ea9-4683-b88f-227c4fd94924-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v" Feb 23 14:35:01.124898 master-0 kubenswrapper[28758]: I0223 14:35:01.124869 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12b256b7-a57b-4124-8452-25e74cfa7926-cert\") pod 
\"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6" Feb 23 14:35:01.124898 master-0 kubenswrapper[28758]: I0223 14:35:01.124885 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-cert\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:35:01.124968 master-0 kubenswrapper[28758]: I0223 14:35:01.124915 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:35:01.125157 master-0 kubenswrapper[28758]: I0223 14:35:01.125117 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0cebb80d-d898-44c8-82b3-1e18833cee3f-profile-collector-cert\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz" Feb 23 14:35:01.127508 master-0 kubenswrapper[28758]: I0223 14:35:01.127466 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Feb 23 14:35:01.133249 master-0 kubenswrapper[28758]: I0223 14:35:01.133214 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-trusted-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: 
\"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:35:01.139642 master-0 kubenswrapper[28758]: I0223 14:35:01.139613 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Feb 23 14:35:01.145297 master-0 kubenswrapper[28758]: I0223 14:35:01.145267 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-service-ca-bundle\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:35:01.160081 master-0 kubenswrapper[28758]: I0223 14:35:01.160001 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 14:35:01.163723 master-0 kubenswrapper[28758]: I0223 14:35:01.163692 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/af950a67-1557-4352-8100-27281bb8ecbe-proxy-tls\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:35:01.180200 master-0 kubenswrapper[28758]: I0223 14:35:01.180138 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 14:35:01.183692 master-0 kubenswrapper[28758]: I0223 14:35:01.183662 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.200262 master-0 
kubenswrapper[28758]: I0223 14:35:01.200145 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 14:35:01.202564 master-0 kubenswrapper[28758]: I0223 14:35:01.202503 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.220045 master-0 kubenswrapper[28758]: I0223 14:35:01.219986 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 14:35:01.223763 master-0 kubenswrapper[28758]: I0223 14:35:01.223703 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.240627 master-0 kubenswrapper[28758]: I0223 14:35:01.240581 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 14:35:01.241717 master-0 kubenswrapper[28758]: I0223 14:35:01.241678 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:35:01.260292 master-0 kubenswrapper[28758]: I0223 14:35:01.260230 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 
14:35:01.264813 master-0 kubenswrapper[28758]: I0223 14:35:01.264769 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:35:01.280151 master-0 kubenswrapper[28758]: I0223 14:35:01.280094 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 14:35:01.284838 master-0 kubenswrapper[28758]: I0223 14:35:01.284797 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" Feb 23 14:35:01.300031 master-0 kubenswrapper[28758]: I0223 14:35:01.299962 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 14:35:01.325008 master-0 kubenswrapper[28758]: I0223 14:35:01.324943 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 14:35:01.332760 master-0 kubenswrapper[28758]: I0223 14:35:01.332713 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" Feb 23 14:35:01.339691 master-0 kubenswrapper[28758]: I0223 14:35:01.339651 28758 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 14:35:01.360382 master-0 kubenswrapper[28758]: I0223 14:35:01.360333 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 23 14:35:01.362722 master-0 kubenswrapper[28758]: I0223 14:35:01.362681 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-images\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:35:01.380560 master-0 kubenswrapper[28758]: I0223 14:35:01.380501 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-5wrwc" Feb 23 14:35:01.400047 master-0 kubenswrapper[28758]: I0223 14:35:01.400000 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 23 14:35:01.404508 master-0 kubenswrapper[28758]: I0223 14:35:01.404440 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ceba7b56-f910-473d-aed5-add94868fb31-machine-api-operator-tls\") pod \"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:35:01.420767 master-0 kubenswrapper[28758]: I0223 14:35:01.420711 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 14:35:01.425963 master-0 kubenswrapper[28758]: I0223 14:35:01.425892 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceba7b56-f910-473d-aed5-add94868fb31-config\") pod 
\"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:35:01.440524 master-0 kubenswrapper[28758]: I0223 14:35:01.440456 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 14:35:01.458847 master-0 kubenswrapper[28758]: I0223 14:35:01.458735 28758 request.go:700] Waited for 2.023861656s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-route-controller-manager/secrets?fieldSelector=metadata.name%3Droute-controller-manager-sa-dockercfg-hnfcp&limit=500&resourceVersion=0 Feb 23 14:35:01.460638 master-0 kubenswrapper[28758]: I0223 14:35:01.460603 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-hnfcp" Feb 23 14:35:01.482589 master-0 kubenswrapper[28758]: I0223 14:35:01.482518 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 14:35:01.500722 master-0 kubenswrapper[28758]: I0223 14:35:01.500461 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 23 14:35:01.505719 master-0 kubenswrapper[28758]: I0223 14:35:01.505659 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-webhook-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:35:01.505923 master-0 kubenswrapper[28758]: I0223 14:35:01.505898 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/0315476e-7140-4777-8061-9cead4c92024-apiservice-cert\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:35:01.519566 master-0 kubenswrapper[28758]: I0223 14:35:01.519505 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 23 14:35:01.542793 master-0 kubenswrapper[28758]: I0223 14:35:01.542704 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-zqmp5" Feb 23 14:35:01.559318 master-0 kubenswrapper[28758]: I0223 14:35:01.559278 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 23 14:35:01.562788 master-0 kubenswrapper[28758]: I0223 14:35:01.562734 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-auth-proxy-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:35:01.579770 master-0 kubenswrapper[28758]: I0223 14:35:01.579706 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 23 14:35:01.583589 master-0 kubenswrapper[28758]: I0223 14:35:01.583553 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-config\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:35:01.600209 master-0 
kubenswrapper[28758]: I0223 14:35:01.600162 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 23 14:35:01.603645 master-0 kubenswrapper[28758]: I0223 14:35:01.603604 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-machine-approver-tls\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:35:01.619649 master-0 kubenswrapper[28758]: I0223 14:35:01.619581 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 23 14:35:01.640493 master-0 kubenswrapper[28758]: I0223 14:35:01.640423 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 14:35:01.643940 master-0 kubenswrapper[28758]: I0223 14:35:01.643909 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/76c67569-3a72-4de9-87cd-432a4607b15b-proxy-tls\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:35:01.659916 master-0 kubenswrapper[28758]: I0223 14:35:01.659879 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-b9hzz" Feb 23 14:35:01.679313 master-0 kubenswrapper[28758]: I0223 14:35:01.679250 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-hcgtz" Feb 23 14:35:01.700617 master-0 kubenswrapper[28758]: I0223 14:35:01.700514 28758 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"certified-operators-dockercfg-jgw8n" Feb 23 14:35:01.719939 master-0 kubenswrapper[28758]: I0223 14:35:01.719799 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-xc5pk" Feb 23 14:35:01.740174 master-0 kubenswrapper[28758]: I0223 14:35:01.739960 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Feb 23 14:35:01.745313 master-0 kubenswrapper[28758]: I0223 14:35:01.745269 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.760087 master-0 kubenswrapper[28758]: I0223 14:35:01.760023 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-vzz9z" Feb 23 14:35:01.779946 master-0 kubenswrapper[28758]: I0223 14:35:01.779903 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Feb 23 14:35:01.783298 master-0 kubenswrapper[28758]: I0223 14:35:01.783256 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-metrics-client-ca\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.783690 master-0 kubenswrapper[28758]: I0223 14:35:01.783648 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fae9a4cf-2acf-4728-9105-87e004052fe5-metrics-client-ca\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:01.783889 master-0 kubenswrapper[28758]: I0223 14:35:01.783861 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15ad7f4e-44c6-4426-8b97-c47a47786544-metrics-client-ca\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:01.785350 master-0 kubenswrapper[28758]: I0223 14:35:01.785314 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18da400b-2271-455d-be0d-0ed44c74f78d-metrics-client-ca\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:35:01.800550 master-0 kubenswrapper[28758]: I0223 14:35:01.800467 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Feb 23 14:35:01.805607 master-0 kubenswrapper[28758]: I0223 14:35:01.805530 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:35:01.819748 master-0 kubenswrapper[28758]: I0223 14:35:01.819696 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Feb 23 14:35:01.824447 
master-0 kubenswrapper[28758]: I0223 14:35:01.824336 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.840551 master-0 kubenswrapper[28758]: I0223 14:35:01.840504 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Feb 23 14:35:01.842280 master-0 kubenswrapper[28758]: I0223 14:35:01.842216 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-state-metrics-tls\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:01.859320 master-0 kubenswrapper[28758]: I0223 14:35:01.859261 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 14:35:01.862806 master-0 kubenswrapper[28758]: I0223 14:35:01.862772 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-proxy-tls\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57" Feb 23 14:35:01.879507 master-0 kubenswrapper[28758]: I0223 14:35:01.879450 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:35:01.899973 master-0 kubenswrapper[28758]: I0223 14:35:01.899916 28758 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-j9tjr" Feb 23 14:35:01.920036 master-0 kubenswrapper[28758]: I0223 14:35:01.919975 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-bqgj7" Feb 23 14:35:01.940349 master-0 kubenswrapper[28758]: I0223 14:35:01.940284 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Feb 23 14:35:01.943956 master-0 kubenswrapper[28758]: I0223 14:35:01.943904 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:35:01.962942 master-0 kubenswrapper[28758]: I0223 14:35:01.962871 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 14:35:01.979694 master-0 kubenswrapper[28758]: I0223 14:35:01.979531 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Feb 23 14:35:01.984387 master-0 kubenswrapper[28758]: I0223 14:35:01.984334 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-images\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:35:02.000063 master-0 kubenswrapper[28758]: I0223 14:35:02.000012 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Feb 23 14:35:02.004984 master-0 kubenswrapper[28758]: I0223 14:35:02.004944 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb" Feb 23 14:35:02.020498 master-0 kubenswrapper[28758]: I0223 14:35:02.020433 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Feb 23 14:35:02.024699 master-0 kubenswrapper[28758]: I0223 14:35:02.024670 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/18da400b-2271-455d-be0d-0ed44c74f78d-prometheus-operator-tls\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:35:02.040026 master-0 kubenswrapper[28758]: I0223 14:35:02.039933 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-5q5j4" Feb 23 14:35:02.059653 master-0 kubenswrapper[28758]: I0223 14:35:02.059614 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-8k4nq" Feb 23 14:35:02.080267 master-0 kubenswrapper[28758]: I0223 14:35:02.080204 28758 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Feb 23 14:35:02.083813 master-0 kubenswrapper[28758]: I0223 14:35:02.083781 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:02.100764 master-0 kubenswrapper[28758]: I0223 14:35:02.100704 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-car2df00nf4i0" Feb 23 14:35:02.103640 master-0 kubenswrapper[28758]: I0223 14:35:02.103603 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:02.120164 master-0 kubenswrapper[28758]: I0223 14:35:02.120124 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Feb 23 14:35:02.122594 master-0 kubenswrapper[28758]: E0223 14:35:02.122554 28758 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:02.122664 master-0 kubenswrapper[28758]: E0223 14:35:02.122636 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle podName:9416f5d0-32b4-4065-b678-26913af8b6dd nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.12261949 +0000 UTC m=+35.248935422 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle") pod "metrics-server-f55d8f669-b2gf9" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd") : failed to sync configmap cache: timed out waiting for the condition Feb 23 14:35:02.122732 master-0 kubenswrapper[28758]: E0223 14:35:02.122657 28758 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.122732 master-0 kubenswrapper[28758]: E0223 14:35:02.122705 28758 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.122811 master-0 kubenswrapper[28758]: E0223 14:35:02.122660 28758 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.122811 master-0 kubenswrapper[28758]: E0223 14:35:02.122764 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs podName:c0a39496-5e47-4415-b8bf-ed0634797ce1 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.122736753 +0000 UTC m=+35.249052715 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs") pod "machine-config-server-qwsmk" (UID: "c0a39496-5e47-4415-b8bf-ed0634797ce1") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.122811 master-0 kubenswrapper[28758]: E0223 14:35:02.122798 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config podName:fae9a4cf-2acf-4728-9105-87e004052fe5 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.122781764 +0000 UTC m=+35.249097736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-6dbff8cb4c-9qg7j" (UID: "fae9a4cf-2acf-4728-9105-87e004052fe5") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.122913 master-0 kubenswrapper[28758]: E0223 14:35:02.122851 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs podName:9416f5d0-32b4-4065-b678-26913af8b6dd nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.122825696 +0000 UTC m=+35.249141668 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs") pod "metrics-server-f55d8f669-b2gf9" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.123026 master-0 kubenswrapper[28758]: I0223 14:35:02.122992 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:02.123996 master-0 kubenswrapper[28758]: E0223 14:35:02.123939 28758 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.123996 master-0 kubenswrapper[28758]: E0223 14:35:02.123976 28758 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124126 master-0 kubenswrapper[28758]: E0223 14:35:02.124025 28758 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124126 master-0 kubenswrapper[28758]: E0223 14:35:02.124052 28758 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124126 master-0 kubenswrapper[28758]: E0223 14:35:02.124080 28758 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124126 master-0 kubenswrapper[28758]: E0223 14:35:02.124027 28758 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls podName:15ad7f4e-44c6-4426-8b97-c47a47786544 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.124000547 +0000 UTC m=+35.250316499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls") pod "node-exporter-ckhv6" (UID: "15ad7f4e-44c6-4426-8b97-c47a47786544") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124126 master-0 kubenswrapper[28758]: E0223 14:35:02.124119 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token podName:c0a39496-5e47-4415-b8bf-ed0634797ce1 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.12410516 +0000 UTC m=+35.250421202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token") pod "machine-config-server-qwsmk" (UID: "c0a39496-5e47-4415-b8bf-ed0634797ce1") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124334 master-0 kubenswrapper[28758]: E0223 14:35:02.124138 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config podName:15ad7f4e-44c6-4426-8b97-c47a47786544 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.124130141 +0000 UTC m=+35.250446203 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config") pod "node-exporter-ckhv6" (UID: "15ad7f4e-44c6-4426-8b97-c47a47786544") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124334 master-0 kubenswrapper[28758]: E0223 14:35:02.124159 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls podName:fae9a4cf-2acf-4728-9105-87e004052fe5 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.124148391 +0000 UTC m=+35.250464463 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls") pod "openshift-state-metrics-6dbff8cb4c-9qg7j" (UID: "fae9a4cf-2acf-4728-9105-87e004052fe5") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124334 master-0 kubenswrapper[28758]: E0223 14:35:02.124174 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert podName:87f989cd-6c19-4a30-833a-10e98b7a0326 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.124167822 +0000 UTC m=+35.250483884 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert") pod "ingress-canary-nwdpd" (UID: "87f989cd-6c19-4a30-833a-10e98b7a0326") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124462 master-0 kubenswrapper[28758]: E0223 14:35:02.124352 28758 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.124462 master-0 kubenswrapper[28758]: E0223 14:35:02.124393 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs podName:8dd5fa7c-0519-4170-89c6-b369e5fc1990 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:03.124384148 +0000 UTC m=+35.250700180 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs") pod "multus-admission-controller-5f54bf67d4-2p4jz" (UID: "8dd5fa7c-0519-4170-89c6-b369e5fc1990") : failed to sync secret cache: timed out waiting for the condition Feb 23 14:35:02.139627 master-0 kubenswrapper[28758]: I0223 14:35:02.139574 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 14:35:02.160677 master-0 kubenswrapper[28758]: I0223 14:35:02.160621 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-l72nr" Feb 23 14:35:02.179839 master-0 kubenswrapper[28758]: I0223 14:35:02.179782 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 14:35:02.199920 master-0 kubenswrapper[28758]: I0223 14:35:02.199874 28758 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 23 14:35:02.219782 master-0 kubenswrapper[28758]: I0223 14:35:02.219740 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Feb 23 14:35:02.240338 master-0 kubenswrapper[28758]: I0223 14:35:02.240211 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Feb 23 14:35:02.259937 master-0 kubenswrapper[28758]: I0223 14:35:02.259902 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Feb 23 14:35:02.280425 master-0 kubenswrapper[28758]: I0223 14:35:02.280361 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-h82x8" Feb 23 14:35:02.300106 master-0 kubenswrapper[28758]: I0223 14:35:02.300055 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 14:35:02.319818 master-0 kubenswrapper[28758]: I0223 14:35:02.319772 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Feb 23 14:35:02.340235 master-0 kubenswrapper[28758]: I0223 14:35:02.340142 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-vkk2b" Feb 23 14:35:02.359778 master-0 kubenswrapper[28758]: I0223 14:35:02.359716 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Feb 23 14:35:02.380419 master-0 kubenswrapper[28758]: I0223 14:35:02.380365 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Feb 23 14:35:02.400072 master-0 kubenswrapper[28758]: I0223 14:35:02.400030 28758 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-srdtb" Feb 23 14:35:02.419418 master-0 kubenswrapper[28758]: I0223 14:35:02.419368 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-wvfd9" Feb 23 14:35:02.439420 master-0 kubenswrapper[28758]: I0223 14:35:02.439374 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 23 14:35:02.458835 master-0 kubenswrapper[28758]: I0223 14:35:02.458784 28758 request.go:700] Waited for 2.997530328s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 23 14:35:02.459847 master-0 kubenswrapper[28758]: I0223 14:35:02.459826 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 14:35:02.491031 master-0 kubenswrapper[28758]: I0223 14:35:02.490927 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p6hn\" (UniqueName: \"kubernetes.io/projected/06bde94a-3126-4d0f-baba-49dc5fbec61b-kube-api-access-2p6hn\") pod \"router-default-7b65dc9fcb-w68qb\" (UID: \"06bde94a-3126-4d0f-baba-49dc5fbec61b\") " pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" Feb 23 14:35:02.509813 master-0 kubenswrapper[28758]: I0223 14:35:02.509779 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sflb\" (UniqueName: \"kubernetes.io/projected/c0a39496-5e47-4415-b8bf-ed0634797ce1-kube-api-access-9sflb\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:35:02.530400 master-0 kubenswrapper[28758]: I0223 14:35:02.530369 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6qszm\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-kube-api-access-6qszm\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:35:02.551347 master-0 kubenswrapper[28758]: I0223 14:35:02.551310 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrvf\" (UniqueName: \"kubernetes.io/projected/af950a67-1557-4352-8100-27281bb8ecbe-kube-api-access-jxrvf\") pod \"machine-config-operator-7f8c75f984-rdjxr\" (UID: \"af950a67-1557-4352-8100-27281bb8ecbe\") " pod="openshift-machine-config-operator/machine-config-operator-7f8c75f984-rdjxr" Feb 23 14:35:02.575383 master-0 kubenswrapper[28758]: I0223 14:35:02.575349 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ddm\" (UniqueName: \"kubernetes.io/projected/15ad7f4e-44c6-4426-8b97-c47a47786544-kube-api-access-94ddm\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:02.596017 master-0 kubenswrapper[28758]: I0223 14:35:02.595977 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbcj\" (UniqueName: \"kubernetes.io/projected/3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3-kube-api-access-qtbcj\") pod \"cluster-autoscaler-operator-86b8dc6d6-2kvfp\" (UID: \"3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3\") " pod="openshift-machine-api/cluster-autoscaler-operator-86b8dc6d6-2kvfp" Feb 23 14:35:02.610914 master-0 kubenswrapper[28758]: I0223 14:35:02.610871 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp5kb\" (UniqueName: \"kubernetes.io/projected/bbe678de-546d-49d0-8280-3f6d94fa5e4f-kube-api-access-kp5kb\") pod \"network-node-identity-td489\" (UID: \"bbe678de-546d-49d0-8280-3f6d94fa5e4f\") " 
pod="openshift-network-node-identity/network-node-identity-td489" Feb 23 14:35:02.633761 master-0 kubenswrapper[28758]: I0223 14:35:02.633678 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtp7w\" (UniqueName: \"kubernetes.io/projected/0315476e-7140-4777-8061-9cead4c92024-kube-api-access-jtp7w\") pod \"packageserver-65c9585877-m66zh\" (UID: \"0315476e-7140-4777-8061-9cead4c92024\") " pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh" Feb 23 14:35:02.651967 master-0 kubenswrapper[28758]: I0223 14:35:02.651908 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6tj\" (UniqueName: \"kubernetes.io/projected/d2aa0d48-7c8e-4ddb-84a3-b3c34414c061-kube-api-access-vp6tj\") pod \"cluster-olm-operator-5bd7768f54-bgg88\" (UID: \"d2aa0d48-7c8e-4ddb-84a3-b3c34414c061\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-5bd7768f54-bgg88" Feb 23 14:35:02.684062 master-0 kubenswrapper[28758]: I0223 14:35:02.683985 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4sbp\" (UniqueName: \"kubernetes.io/projected/607c1101-3533-43e3-9eda-13cea2b9dbb6-kube-api-access-v4sbp\") pod \"dns-operator-8c7d49845-5rk2g\" (UID: \"607c1101-3533-43e3-9eda-13cea2b9dbb6\") " pod="openshift-dns-operator/dns-operator-8c7d49845-5rk2g" Feb 23 14:35:02.692727 master-0 kubenswrapper[28758]: I0223 14:35:02.692680 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hp42\" (UniqueName: \"kubernetes.io/projected/24829faf-50e8-45bb-abb0-7cc5ccf81080-kube-api-access-7hp42\") pod \"openshift-apiserver-operator-8586dccc9b-tvnmq\" (UID: \"24829faf-50e8-45bb-abb0-7cc5ccf81080\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-8586dccc9b-tvnmq" Feb 23 14:35:02.708450 master-0 kubenswrapper[28758]: I0223 14:35:02.708371 28758 scope.go:117] "RemoveContainer" 
containerID="7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06" Feb 23 14:35:02.717827 master-0 kubenswrapper[28758]: I0223 14:35:02.717729 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzcqx\" (UniqueName: \"kubernetes.io/projected/f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c-kube-api-access-dzcqx\") pod \"kube-state-metrics-59584d565f-pdl4r\" (UID: \"f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c\") " pod="openshift-monitoring/kube-state-metrics-59584d565f-pdl4r" Feb 23 14:35:02.733602 master-0 kubenswrapper[28758]: I0223 14:35:02.732498 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qggzs\" (UniqueName: \"kubernetes.io/projected/57b57915-64dd-42f5-b06f-bc4bcc06b667-kube-api-access-qggzs\") pod \"cluster-node-tuning-operator-bcf775fc9-z5t5b\" (UID: \"57b57915-64dd-42f5-b06f-bc4bcc06b667\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bcf775fc9-z5t5b" Feb 23 14:35:02.755711 master-0 kubenswrapper[28758]: I0223 14:35:02.755603 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nr85\" (UniqueName: \"kubernetes.io/projected/e2d00ece-7586-4346-adbb-eaae1aeda69e-kube-api-access-4nr85\") pod \"authentication-operator-5bd7c86784-mlbx2\" (UID: \"e2d00ece-7586-4346-adbb-eaae1aeda69e\") " pod="openshift-authentication-operator/authentication-operator-5bd7c86784-mlbx2" Feb 23 14:35:02.770953 master-0 kubenswrapper[28758]: I0223 14:35:02.770914 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8v9z\" (UniqueName: \"kubernetes.io/projected/fae9a4cf-2acf-4728-9105-87e004052fe5-kube-api-access-x8v9z\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:02.791756 master-0 kubenswrapper[28758]: I0223 14:35:02.791716 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8llc8\" (UniqueName: \"kubernetes.io/projected/a4ae9292-71dc-4484-b277-43cb26c1e04d-kube-api-access-8llc8\") pod \"csi-snapshot-controller-operator-6fb4df594f-hkcgz\" (UID: \"a4ae9292-71dc-4484-b277-43cb26c1e04d\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-6fb4df594f-hkcgz" Feb 23 14:35:02.816085 master-0 kubenswrapper[28758]: I0223 14:35:02.816036 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brd4j\" (UniqueName: \"kubernetes.io/projected/674041a2-e2b0-4286-88cc-f1b00571e3f3-kube-api-access-brd4j\") pod \"network-operator-7d7db75979-x4qnw\" (UID: \"674041a2-e2b0-4286-88cc-f1b00571e3f3\") " pod="openshift-network-operator/network-operator-7d7db75979-x4qnw" Feb 23 14:35:02.835708 master-0 kubenswrapper[28758]: I0223 14:35:02.835662 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jlzj\" (UniqueName: \"kubernetes.io/projected/76c67569-3a72-4de9-87cd-432a4607b15b-kube-api-access-2jlzj\") pod \"machine-config-daemon-fhcgg\" (UID: \"76c67569-3a72-4de9-87cd-432a4607b15b\") " pod="openshift-machine-config-operator/machine-config-daemon-fhcgg" Feb 23 14:35:02.852228 master-0 kubenswrapper[28758]: I0223 14:35:02.852186 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qfr7\" (UniqueName: \"kubernetes.io/projected/e5104cdd-85b8-49ba-95ca-3e9c8218a01e-kube-api-access-8qfr7\") pod \"network-check-source-58fb6744f5-848dv\" (UID: \"e5104cdd-85b8-49ba-95ca-3e9c8218a01e\") " pod="openshift-network-diagnostics/network-check-source-58fb6744f5-848dv" Feb 23 14:35:02.883227 master-0 kubenswrapper[28758]: I0223 14:35:02.883161 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72769\" (UniqueName: \"kubernetes.io/projected/ceba7b56-f910-473d-aed5-add94868fb31-kube-api-access-72769\") pod 
\"machine-api-operator-5c7cf458b4-bb7zl\" (UID: \"ceba7b56-f910-473d-aed5-add94868fb31\") " pod="openshift-machine-api/machine-api-operator-5c7cf458b4-bb7zl" Feb 23 14:35:02.891968 master-0 kubenswrapper[28758]: I0223 14:35:02.891918 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2p2\" (UniqueName: \"kubernetes.io/projected/865ceedb-b19a-4f2f-b295-311e1b7a645e-kube-api-access-tr2p2\") pod \"kube-storage-version-migrator-operator-fc889cfd5-tw2r9\" (UID: \"865ceedb-b19a-4f2f-b295-311e1b7a645e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-fc889cfd5-tw2r9" Feb 23 14:35:02.911547 master-0 kubenswrapper[28758]: I0223 14:35:02.911494 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkx2\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-kube-api-access-knkx2\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7" Feb 23 14:35:02.932167 master-0 kubenswrapper[28758]: I0223 14:35:02.932104 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmkf\" (UniqueName: \"kubernetes.io/projected/08c561b3-613b-425f-9de4-d5fc8762ea51-kube-api-access-phmkf\") pod \"iptables-alerter-t5h8h\" (UID: \"08c561b3-613b-425f-9de4-d5fc8762ea51\") " pod="openshift-network-operator/iptables-alerter-t5h8h" Feb 23 14:35:02.957036 master-0 kubenswrapper[28758]: I0223 14:35:02.956987 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhsc6\" (UniqueName: \"kubernetes.io/projected/2e89a047-9ebc-459b-b7b3-e902c1fb0e17-kube-api-access-bhsc6\") pod \"csi-snapshot-controller-6847bb4785-5fw2x\" (UID: \"2e89a047-9ebc-459b-b7b3-e902c1fb0e17\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-6847bb4785-5fw2x" Feb 
23 14:35:02.971700 master-0 kubenswrapper[28758]: I0223 14:35:02.971647 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc4t\" (UniqueName: \"kubernetes.io/projected/efdde2df-cd07-4898-88f4-7ecde0e04d7a-kube-api-access-tpc4t\") pod \"certified-operators-cdrlk\" (UID: \"efdde2df-cd07-4898-88f4-7ecde0e04d7a\") " pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:35:02.993499 master-0 kubenswrapper[28758]: I0223 14:35:02.993429 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-269v7\" (UniqueName: \"kubernetes.io/projected/f10f592e-5738-4879-b776-246b357d4621-kube-api-access-269v7\") pod \"ovnkube-node-ftngv\" (UID: \"f10f592e-5738-4879-b776-246b357d4621\") " pod="openshift-ovn-kubernetes/ovnkube-node-ftngv" Feb 23 14:35:03.013802 master-0 kubenswrapper[28758]: I0223 14:35:03.013649 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf04aca0-8174-4134-835d-37adf6a3b5ca-kube-api-access\") pod \"kube-controller-manager-operator-7bcfbc574b-zdntd\" (UID: \"cf04aca0-8174-4134-835d-37adf6a3b5ca\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-7bcfbc574b-zdntd" Feb 23 14:35:03.033199 master-0 kubenswrapper[28758]: I0223 14:35:03.033153 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9tkx\" (UniqueName: \"kubernetes.io/projected/585f74db-4593-426b-b0c7-ec8f64810549-kube-api-access-q9tkx\") pod \"marketplace-operator-6f5488b997-7b5sp\" (UID: \"585f74db-4593-426b-b0c7-ec8f64810549\") " pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp" Feb 23 14:35:03.050890 master-0 kubenswrapper[28758]: I0223 14:35:03.050822 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzj2j\" (UniqueName: 
\"kubernetes.io/projected/ae4baa4e-4ef4-433d-aa36-149e92fa6ee2-kube-api-access-lzj2j\") pod \"insights-operator-59b498fcfb-rz897\" (UID: \"ae4baa4e-4ef4-433d-aa36-149e92fa6ee2\") " pod="openshift-insights/insights-operator-59b498fcfb-rz897" Feb 23 14:35:03.089270 master-0 kubenswrapper[28758]: I0223 14:35:03.089213 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzrqz\" (UniqueName: \"kubernetes.io/projected/588a804a-430a-47f4-aa97-c08e907239da-kube-api-access-hzrqz\") pod \"apiserver-666b887977-f7h55\" (UID: \"588a804a-430a-47f4-aa97-c08e907239da\") " pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:35:03.102541 master-0 kubenswrapper[28758]: I0223 14:35:03.102094 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjs6f\" (UniqueName: \"kubernetes.io/projected/b090ed5a-984f-41dd-8cea-34a1ece1514f-kube-api-access-fjs6f\") pod \"ovnkube-control-plane-5d8dfcdc87-jbc2v\" (UID: \"b090ed5a-984f-41dd-8cea-34a1ece1514f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-5d8dfcdc87-jbc2v" Feb 23 14:35:03.113432 master-0 kubenswrapper[28758]: I0223 14:35:03.113385 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzn8r\" (UniqueName: \"kubernetes.io/projected/c84f66f0-207e-436a-8f4e-d1971fa815eb-kube-api-access-gzn8r\") pod \"redhat-operators-tl6dk\" (UID: \"c84f66f0-207e-436a-8f4e-d1971fa815eb\") " pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:35:03.133504 master-0 kubenswrapper[28758]: I0223 14:35:03.133425 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88qnh\" (UniqueName: \"kubernetes.io/projected/483786a0-0a29-44bf-bbd0-2f37e045aa2c-kube-api-access-88qnh\") pod \"multus-additional-cni-plugins-jdsv6\" (UID: \"483786a0-0a29-44bf-bbd0-2f37e045aa2c\") " pod="openshift-multus/multus-additional-cni-plugins-jdsv6" Feb 23 14:35:03.167063 master-0 kubenswrapper[28758]: I0223 
14:35:03.166941 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:35:03.167063 master-0 kubenswrapper[28758]: I0223 14:35:03.167008 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:03.167063 master-0 kubenswrapper[28758]: I0223 14:35:03.167065 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167270 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167303 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-certs\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " 
pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167427 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167547 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ckhv6\" (UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167652 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87f989cd-6c19-4a30-833a-10e98b7a0326-cert\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " pod="openshift-ingress-canary/ingress-canary-nwdpd" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167664 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167733 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/15ad7f4e-44c6-4426-8b97-c47a47786544-node-exporter-tls\") pod \"node-exporter-ckhv6\" 
(UID: \"15ad7f4e-44c6-4426-8b97-c47a47786544\") " pod="openshift-monitoring/node-exporter-ckhv6" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167785 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:03.167936 master-0 kubenswrapper[28758]: I0223 14:35:03.167804 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:35:03.168312 master-0 kubenswrapper[28758]: I0223 14:35:03.167956 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/c0a39496-5e47-4415-b8bf-ed0634797ce1-node-bootstrap-token\") pod \"machine-config-server-qwsmk\" (UID: \"c0a39496-5e47-4415-b8bf-ed0634797ce1\") " pod="openshift-machine-config-operator/machine-config-server-qwsmk" Feb 23 14:35:03.168312 master-0 kubenswrapper[28758]: I0223 14:35:03.167998 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:03.168312 master-0 kubenswrapper[28758]: I0223 14:35:03.168011 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8dd5fa7c-0519-4170-89c6-b369e5fc1990-webhook-certs\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz" Feb 23 14:35:03.168312 master-0 kubenswrapper[28758]: I0223 14:35:03.168142 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:03.168312 master-0 kubenswrapper[28758]: I0223 14:35:03.168174 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:03.168312 master-0 kubenswrapper[28758]: I0223 14:35:03.168189 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fae9a4cf-2acf-4728-9105-87e004052fe5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-6dbff8cb4c-9qg7j\" (UID: \"fae9a4cf-2acf-4728-9105-87e004052fe5\") " pod="openshift-monitoring/openshift-state-metrics-6dbff8cb4c-9qg7j" Feb 23 14:35:03.168569 master-0 kubenswrapper[28758]: I0223 14:35:03.168423 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:03.168569 master-0 kubenswrapper[28758]: I0223 14:35:03.168489 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:35:03.174558 master-0 kubenswrapper[28758]: I0223 14:35:03.174516 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lz2\" (UniqueName: \"kubernetes.io/projected/ded555da-db03-498e-81a9-ad166f29a2aa-kube-api-access-x4lz2\") pod \"network-check-target-x9gxm\" (UID: \"ded555da-db03-498e-81a9-ad166f29a2aa\") " pod="openshift-network-diagnostics/network-check-target-x9gxm" Feb 23 14:35:03.178724 master-0 kubenswrapper[28758]: I0223 14:35:03.178686 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmmn9\" (UniqueName: \"kubernetes.io/projected/9b558268-2262-4593-893e-408639a9987d-kube-api-access-nmmn9\") pod \"tuned-wsx6c\" (UID: \"9b558268-2262-4593-893e-408639a9987d\") " pod="openshift-cluster-node-tuning-operator/tuned-wsx6c" Feb 23 14:35:03.203710 master-0 kubenswrapper[28758]: I0223 14:35:03.203663 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g999k\" (UniqueName: \"kubernetes.io/projected/8ca3dee6-f651-4536-991c-303752c22f07-kube-api-access-g999k\") pod \"migrator-5c85bff57-vk2x8\" (UID: \"8ca3dee6-f651-4536-991c-303752c22f07\") " pod="openshift-kube-storage-version-migrator/migrator-5c85bff57-vk2x8" Feb 23 14:35:03.217856 master-0 
kubenswrapper[28758]: I0223 14:35:03.217804 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhr9\" (UniqueName: \"kubernetes.io/projected/709ac071-4392-4a3f-a3d1-4bc8ba2f6236-kube-api-access-6qhr9\") pod \"service-ca-576b4d78bd-lq6ct\" (UID: \"709ac071-4392-4a3f-a3d1-4bc8ba2f6236\") " pod="openshift-service-ca/service-ca-576b4d78bd-lq6ct" Feb 23 14:35:03.231972 master-0 kubenswrapper[28758]: I0223 14:35:03.231921 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cd7w\" (UniqueName: \"kubernetes.io/projected/ea0b3538-9a7d-4995-b628-2d63f21d683c-kube-api-access-2cd7w\") pod \"apiserver-67f44b4d6d-7lpn4\" (UID: \"ea0b3538-9a7d-4995-b628-2d63f21d683c\") " pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:35:03.254131 master-0 kubenswrapper[28758]: I0223 14:35:03.254077 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5f8j\" (UniqueName: \"kubernetes.io/projected/5b54fc16-d2f7-4b10-a611-5b411b389c5a-kube-api-access-d5f8j\") pod \"package-server-manager-5c75f78c8b-cj2l7\" (UID: \"5b54fc16-d2f7-4b10-a611-5b411b389c5a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7" Feb 23 14:35:03.276291 master-0 kubenswrapper[28758]: I0223 14:35:03.276151 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w5kr\" (UniqueName: \"kubernetes.io/projected/18da400b-2271-455d-be0d-0ed44c74f78d-kube-api-access-2w5kr\") pod \"prometheus-operator-754bc4d665-nl92v\" (UID: \"18da400b-2271-455d-be0d-0ed44c74f78d\") " pod="openshift-monitoring/prometheus-operator-754bc4d665-nl92v" Feb 23 14:35:03.290420 master-0 kubenswrapper[28758]: I0223 14:35:03.290368 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlz28\" (UniqueName: \"kubernetes.io/projected/1c60ff3f-2bb1-422e-be27-5eca96d85fd2-kube-api-access-jlz28\") pod 
\"operator-controller-controller-manager-9cc7d7bb-6zmk9\" (UID: \"1c60ff3f-2bb1-422e-be27-5eca96d85fd2\") " pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9" Feb 23 14:35:03.315858 master-0 kubenswrapper[28758]: I0223 14:35:03.315813 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9z2f\" (UniqueName: \"kubernetes.io/projected/09d80e28-0b64-4c5d-a9bc-99d843d40165-kube-api-access-g9z2f\") pod \"multus-vdzqk\" (UID: \"09d80e28-0b64-4c5d-a9bc-99d843d40165\") " pod="openshift-multus/multus-vdzqk" Feb 23 14:35:03.330096 master-0 kubenswrapper[28758]: I0223 14:35:03.330052 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7jvd\" (UniqueName: \"kubernetes.io/projected/8de1f285-47ac-42aa-8026-8addce656362-kube-api-access-x7jvd\") pod \"etcd-operator-545bf96f4d-fpwtm\" (UID: \"8de1f285-47ac-42aa-8026-8addce656362\") " pod="openshift-etcd-operator/etcd-operator-545bf96f4d-fpwtm" Feb 23 14:35:03.354765 master-0 kubenswrapper[28758]: I0223 14:35:03.354132 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqzc\" (UniqueName: \"kubernetes.io/projected/66c72c71-f74a-43ab-bf0d-1f4c93623774-kube-api-access-xlqzc\") pod \"catalogd-controller-manager-84b8d9d697-2hr5s\" (UID: \"66c72c71-f74a-43ab-bf0d-1f4c93623774\") " pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s" Feb 23 14:35:03.370391 master-0 kubenswrapper[28758]: I0223 14:35:03.370338 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3488a7eb-5170-478c-9af7-490dbe0f514e-bound-sa-token\") pod \"ingress-operator-6569778c84-hsl6c\" (UID: \"3488a7eb-5170-478c-9af7-490dbe0f514e\") " pod="openshift-ingress-operator/ingress-operator-6569778c84-hsl6c" Feb 23 14:35:03.390149 master-0 kubenswrapper[28758]: I0223 14:35:03.389822 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pqkz4\" (UniqueName: \"kubernetes.io/projected/6a801da1-a7eb-4187-98b8-315076f55e19-kube-api-access-pqkz4\") pod \"dns-default-86l7f\" (UID: \"6a801da1-a7eb-4187-98b8-315076f55e19\") " pod="openshift-dns/dns-default-86l7f" Feb 23 14:35:03.410304 master-0 kubenswrapper[28758]: I0223 14:35:03.410155 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8ff\" (UniqueName: \"kubernetes.io/projected/c67a2ed2-f520-46fc-84d3-6816dc19f4e0-kube-api-access-hj8ff\") pod \"machine-approver-7dd9c7d7b9-rn8fj\" (UID: \"c67a2ed2-f520-46fc-84d3-6816dc19f4e0\") " pod="openshift-cluster-machine-approver/machine-approver-7dd9c7d7b9-rn8fj" Feb 23 14:35:03.430532 master-0 kubenswrapper[28758]: I0223 14:35:03.430467 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5d9w\" (UniqueName: \"kubernetes.io/projected/85365dec-af50-406c-b258-890e4f454c4a-kube-api-access-k5d9w\") pod \"cloud-credential-operator-6968c58f46-p7jh7\" (UID: \"85365dec-af50-406c-b258-890e4f454c4a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-6968c58f46-p7jh7" Feb 23 14:35:03.452048 master-0 kubenswrapper[28758]: I0223 14:35:03.451982 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr868\" (UniqueName: \"kubernetes.io/projected/b714a9df-026e-423d-a980-2569f0d92e47-kube-api-access-lr868\") pod \"service-ca-operator-c48c8bf7c-vtnsw\" (UID: \"b714a9df-026e-423d-a980-2569f0d92e47\") " pod="openshift-service-ca-operator/service-ca-operator-c48c8bf7c-vtnsw" Feb 23 14:35:03.472501 master-0 kubenswrapper[28758]: I0223 14:35:03.472422 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpqzn\" (UniqueName: \"kubernetes.io/projected/87f989cd-6c19-4a30-833a-10e98b7a0326-kube-api-access-wpqzn\") pod \"ingress-canary-nwdpd\" (UID: \"87f989cd-6c19-4a30-833a-10e98b7a0326\") " 
pod="openshift-ingress-canary/ingress-canary-nwdpd"
Feb 23 14:35:03.478663 master-0 kubenswrapper[28758]: I0223 14:35:03.478624 28758 request.go:700] Waited for 3.911903871s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/serviceaccounts/redhat-marketplace/token
Feb 23 14:35:03.492264 master-0 kubenswrapper[28758]: I0223 14:35:03.492225 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp8kk\" (UniqueName: \"kubernetes.io/projected/3f86e881-275c-4387-a23a-06c559c8f1e8-kube-api-access-wp8kk\") pod \"redhat-marketplace-pfb9h\" (UID: \"3f86e881-275c-4387-a23a-06c559c8f1e8\") " pod="openshift-marketplace/redhat-marketplace-pfb9h"
Feb 23 14:35:03.511624 master-0 kubenswrapper[28758]: I0223 14:35:03.511551 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnfl\" (UniqueName: \"kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl\") pod \"metrics-server-f55d8f669-b2gf9\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9"
Feb 23 14:35:03.533355 master-0 kubenswrapper[28758]: I0223 14:35:03.533249 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phzkn\" (UniqueName: \"kubernetes.io/projected/5cc28e06-3542-4a25-a8b1-5f5b4ee41114-kube-api-access-phzkn\") pod \"machine-config-controller-54cb48566c-g4r57\" (UID: \"5cc28e06-3542-4a25-a8b1-5f5b4ee41114\") " pod="openshift-machine-config-operator/machine-config-controller-54cb48566c-g4r57"
Feb 23 14:35:03.553943 master-0 kubenswrapper[28758]: I0223 14:35:03.553892 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9cf1c39-24f0-420b-8020-089616d1cdf0-kube-api-access\") pod \"openshift-kube-scheduler-operator-77cd4d9559-qvq8x\" (UID:
\"b9cf1c39-24f0-420b-8020-089616d1cdf0\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-77cd4d9559-qvq8x"
Feb 23 14:35:03.570908 master-0 kubenswrapper[28758]: I0223 14:35:03.570846 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chznd\" (UniqueName: \"kubernetes.io/projected/cb6e88cd-98de-446a-92e8-f56a2f133703-kube-api-access-chznd\") pod \"openshift-controller-manager-operator-584cc7bcb5-67ds6\" (UID: \"cb6e88cd-98de-446a-92e8-f56a2f133703\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-584cc7bcb5-67ds6"
Feb 23 14:35:03.593344 master-0 kubenswrapper[28758]: I0223 14:35:03.593280 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwtcj\" (UniqueName: \"kubernetes.io/projected/2f876e5d-2e82-47d0-8a9c-adacf2bddf77-kube-api-access-pwtcj\") pod \"node-resolver-7b6jk\" (UID: \"2f876e5d-2e82-47d0-8a9c-adacf2bddf77\") " pod="openshift-dns/node-resolver-7b6jk"
Feb 23 14:35:03.615423 master-0 kubenswrapper[28758]: I0223 14:35:03.615328 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42sml\" (UniqueName: \"kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml\") pod \"controller-manager-55d786cb4c-cqkbt\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") " pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:35:03.641011 master-0 kubenswrapper[28758]: I0223 14:35:03.640961 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl87q\" (UniqueName: \"kubernetes.io/projected/fbb66172-1ea9-4683-b88f-227c4fd94924-kube-api-access-kl87q\") pod \"cluster-storage-operator-f94476f49-s6c8v\" (UID: \"fbb66172-1ea9-4683-b88f-227c4fd94924\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f94476f49-s6c8v"
Feb 23 14:35:03.654890 master-0 kubenswrapper[28758]: I0223 14:35:03.654838
28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/961e4ecd-545b-4270-ae34-e733dec793b6-kube-api-access\") pod \"kube-apiserver-operator-5d87bf58c-nq2tz\" (UID: \"961e4ecd-545b-4270-ae34-e733dec793b6\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5d87bf58c-nq2tz"
Feb 23 14:35:03.673856 master-0 kubenswrapper[28758]: I0223 14:35:03.673763 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzsd\" (UniqueName: \"kubernetes.io/projected/646fece3-4a42-4e0c-bcc7-5f705f948d63-kube-api-access-2jzsd\") pod \"cluster-monitoring-operator-6bb6d78bf-wzqcp\" (UID: \"646fece3-4a42-4e0c-bcc7-5f705f948d63\") " pod="openshift-monitoring/cluster-monitoring-operator-6bb6d78bf-wzqcp"
Feb 23 14:35:03.695239 master-0 kubenswrapper[28758]: I0223 14:35:03.695187 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chs7z\" (UniqueName: \"kubernetes.io/projected/8dd5fa7c-0519-4170-89c6-b369e5fc1990-kube-api-access-chs7z\") pod \"multus-admission-controller-5f54bf67d4-2p4jz\" (UID: \"8dd5fa7c-0519-4170-89c6-b369e5fc1990\") " pod="openshift-multus/multus-admission-controller-5f54bf67d4-2p4jz"
Feb 23 14:35:03.712797 master-0 kubenswrapper[28758]: I0223 14:35:03.712727 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn4m\" (UniqueName: \"kubernetes.io/projected/255b5a89-1b89-42dc-868a-32ce67975a54-kube-api-access-5nn4m\") pod \"catalog-operator-596f79dd6f-mhzxn\" (UID: \"255b5a89-1b89-42dc-868a-32ce67975a54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn"
Feb 23 14:35:03.731467 master-0 kubenswrapper[28758]: I0223 14:35:03.731404 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rg9g\" (UniqueName: \"kubernetes.io/projected/12b256b7-a57b-4124-8452-25e74cfa7926-kube-api-access-2rg9g\") pod
\"cluster-baremetal-operator-d6bb9bb76-4frj6\" (UID: \"12b256b7-a57b-4124-8452-25e74cfa7926\") " pod="openshift-machine-api/cluster-baremetal-operator-d6bb9bb76-4frj6"
Feb 23 14:35:03.751344 master-0 kubenswrapper[28758]: I0223 14:35:03.751285 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzkcs\" (UniqueName: \"kubernetes.io/projected/ace75aae-6f4f-4299-90e2-d5292271b136-kube-api-access-wzkcs\") pod \"network-metrics-daemon-9dnsv\" (UID: \"ace75aae-6f4f-4299-90e2-d5292271b136\") " pod="openshift-multus/network-metrics-daemon-9dnsv"
Feb 23 14:35:03.771060 master-0 kubenswrapper[28758]: I0223 14:35:03.770962 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9774f8c-0f29-46d8-be77-81bcf74d5994-kube-api-access\") pod \"cluster-version-operator-57476485-m58rm\" (UID: \"b9774f8c-0f29-46d8-be77-81bcf74d5994\") " pod="openshift-cluster-version/cluster-version-operator-57476485-m58rm"
Feb 23 14:35:03.849423 master-0 kubenswrapper[28758]: I0223 14:35:03.849279 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tl7p\" (UniqueName: \"kubernetes.io/projected/92c63c95-e880-4f51-9858-7715343f7bd8-kube-api-access-9tl7p\") pod \"openshift-config-operator-6f47d587d6-55qjr\" (UID: \"92c63c95-e880-4f51-9858-7715343f7bd8\") " pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:35:03.850840 master-0 kubenswrapper[28758]: I0223 14:35:03.850803 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7rw\" (UniqueName: \"kubernetes.io/projected/1a283e3a-33ba-4ef7-87d3-55ed8c953fb4-kube-api-access-rr7rw\") pod \"cluster-samples-operator-65c5c48b9b-ps6x5\" (UID: \"1a283e3a-33ba-4ef7-87d3-55ed8c953fb4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-65c5c48b9b-ps6x5"
Feb 23 14:35:03.851370 master-0 kubenswrapper[28758]: I0223
14:35:03.851314 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khfkr\" (UniqueName: \"kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr\") pod \"route-controller-manager-8bb99f4f-msq8f\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") " pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:35:03.859089 master-0 kubenswrapper[28758]: I0223 14:35:03.859051 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxqg\" (UniqueName: \"kubernetes.io/projected/adbf8f71-f005-4e5b-9de1-e49559cf7386-kube-api-access-5vxqg\") pod \"community-operators-fjpvt\" (UID: \"adbf8f71-f005-4e5b-9de1-e49559cf7386\") " pod="openshift-marketplace/community-operators-fjpvt"
Feb 23 14:35:03.874568 master-0 kubenswrapper[28758]: I0223 14:35:03.874512 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ad0f0d72-0337-4347-bb50-e299a175f3ca-bound-sa-token\") pod \"cluster-image-registry-operator-779979bdf7-ml2d7\" (UID: \"ad0f0d72-0337-4347-bb50-e299a175f3ca\") " pod="openshift-image-registry/cluster-image-registry-operator-779979bdf7-ml2d7"
Feb 23 14:35:03.890709 master-0 kubenswrapper[28758]: I0223 14:35:03.890653 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv5nj\" (UniqueName: \"kubernetes.io/projected/4373687a-61a0-434b-81f7-3fecaa1494ef-kube-api-access-wv5nj\") pod \"control-plane-machine-set-operator-686847ff5f-9q266\" (UID: \"4373687a-61a0-434b-81f7-3fecaa1494ef\") " pod="openshift-machine-api/control-plane-machine-set-operator-686847ff5f-9q266"
Feb 23 14:35:03.921063 master-0 kubenswrapper[28758]: I0223 14:35:03.921003 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44599\" (UniqueName:
\"kubernetes.io/projected/0cebb80d-d898-44c8-82b3-1e18833cee3f-kube-api-access-44599\") pod \"olm-operator-5499d7f7bb-t45zz\" (UID: \"0cebb80d-d898-44c8-82b3-1e18833cee3f\") " pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"
Feb 23 14:35:03.932493 master-0 kubenswrapper[28758]: I0223 14:35:03.932423 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x6q2\" (UniqueName: \"kubernetes.io/projected/172d47fd-e1a1-4d77-9e31-c4f22e824d5f-kube-api-access-9x6q2\") pod \"cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb\" (UID: \"172d47fd-e1a1-4d77-9e31-c4f22e824d5f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb"
Feb 23 14:35:03.949638 master-0 kubenswrapper[28758]: E0223 14:35:03.949587 28758 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:03.949638 master-0 kubenswrapper[28758]: E0223 14:35:03.949626 28758 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-4-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:03.949840 master-0 kubenswrapper[28758]: E0223 14:35:03.949699 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access podName:e1148263-7b15-4c12-a217-8b030ecd9348 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:04.449676893 +0000 UTC m=+36.575992825 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access") pod "installer-4-master-0" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:03.972372 master-0 kubenswrapper[28758]: E0223 14:35:03.972325 28758 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.885s"
Feb 23 14:35:03.972537 master-0 kubenswrapper[28758]: I0223 14:35:03.972391 28758 status_manager.go:379] "Container startup changed for unknown container" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" containerID="cri-o://7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06"
Feb 23 14:35:03.972537 master-0 kubenswrapper[28758]: I0223 14:35:03.972410 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:03.972537 master-0 kubenswrapper[28758]: I0223 14:35:03.972469 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp"
Feb 23 14:35:03.984460 master-0 kubenswrapper[28758]: I0223 14:35:03.984414 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Feb 23 14:35:04.021375 master-0 kubenswrapper[28758]: I0223 14:35:04.021327 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-75d56db95f-rg8tp"
Feb 23 14:35:04.021375 master-0 kubenswrapper[28758]: I0223 14:35:04.021369 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0"
event={"ID":"959c75833224b4ba3fa488b77d8f5032","Type":"ContainerStarted","Data":"f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4"}
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021393 28758 status_manager.go:379] "Container startup changed for unknown container" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" containerID="cri-o://7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021402 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021412 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021426 28758 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" containerID="cri-o://7461839a3a630e391eda2be4a947e3e187fea230edbbc3e8b3af02abc9e03e06"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021434 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021441 28758 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021494 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021518 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-operator-lifecycle-manager/packageserver-65c9585877-m66zh"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021531 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021543 28758 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="7957d771-baea-4150-9b67-556f890a05d5"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021556 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb" event={"ID":"06bde94a-3126-4d0f-baba-49dc5fbec61b","Type":"ContainerStarted","Data":"0a95beed49d38f4ead367fae9b123c8123da14c34a22193aaf09e63776bf0fbc"}
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021597 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021610 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021618 28758 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="7957d771-baea-4150-9b67-556f890a05d5"
Feb 23 14:35:04.021652 master-0 kubenswrapper[28758]: I0223 14:35:04.021647 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.021676 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223
14:35:04.021687 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cdrlk"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.021719 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.021748 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.021758 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.022876 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ftngv"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.023065 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cdrlk"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.023209 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.023927 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tl6dk"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.023963 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pfb9h"
Feb 23 14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.024031 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cdrlk"
Feb 23
14:35:04.024064 master-0 kubenswrapper[28758]: I0223 14:35:04.024049 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024085 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024115 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024138 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-6f5488b997-7b5sp"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024162 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024230 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024296 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024328 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-5c75f78c8b-cj2l7"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024406 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fjpvt"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223
14:35:04.024432 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x9gxm"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024460 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024493 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024510 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024524 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tl6dk"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024536 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-86l7f"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024550 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024567 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024585 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pfb9h"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024597 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024631 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024660 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"
Feb 23 14:35:04.024665 master-0 kubenswrapper[28758]: I0223 14:35:04.024685 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn"
Feb 23 14:35:04.025387 master-0 kubenswrapper[28758]: I0223 14:35:04.024713 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:35:04.025387 master-0 kubenswrapper[28758]: I0223 14:35:04.024746 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fjpvt"
Feb 23 14:35:04.025387 master-0 kubenswrapper[28758]: I0223 14:35:04.024776 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-666b887977-f7h55"
Feb 23 14:35:04.025387 master-0 kubenswrapper[28758]: I0223 14:35:04.024799 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-86l7f"
Feb 23 14:35:04.025387 master-0 kubenswrapper[28758]: I0223 14:35:04.024824 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-9cc7d7bb-6zmk9"
Feb 23 14:35:04.027341 master-0 kubenswrapper[28758]: I0223 14:35:04.027302 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-catalogd/catalogd-controller-manager-84b8d9d697-2hr5s"
Feb 23 14:35:04.028929 master-0 kubenswrapper[28758]: I0223 14:35:04.028895 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9"
Feb 23 14:35:04.029210 master-0 kubenswrapper[28758]: I0223 14:35:04.029125 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5499d7f7bb-t45zz"
Feb 23 14:35:04.029412 master-0 kubenswrapper[28758]: I0223 14:35:04.029270 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:35:04.029412 master-0 kubenswrapper[28758]: I0223 14:35:04.029358 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4"
Feb 23 14:35:04.031295 master-0 kubenswrapper[28758]: I0223 14:35:04.030311 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-596f79dd6f-mhzxn"
Feb 23 14:35:04.031295 master-0 kubenswrapper[28758]: I0223 14:35:04.030352 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-6f47d587d6-55qjr"
Feb 23 14:35:04.031295 master-0 kubenswrapper[28758]: I0223 14:35:04.031020 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:35:04.065136 master-0 kubenswrapper[28758]: I0223 14:35:04.065100 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tl6dk"
Feb 23 14:35:04.066097 master-0 kubenswrapper[28758]: I0223 14:35:04.066081 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-marketplace/community-operators-fjpvt"
Feb 23 14:35:04.072365 master-0 kubenswrapper[28758]: I0223 14:35:04.072332 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pfb9h"
Feb 23 14:35:04.489608 master-0 kubenswrapper[28758]: I0223 14:35:04.489534 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:35:04.490071 master-0 kubenswrapper[28758]: E0223 14:35:04.489702 28758 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:04.490071 master-0 kubenswrapper[28758]: E0223 14:35:04.489720 28758 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-4-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:04.490071 master-0 kubenswrapper[28758]: E0223 14:35:04.489763 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access podName:e1148263-7b15-4c12-a217-8b030ecd9348 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:05.489748894 +0000 UTC m=+37.616064816 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access") pod "installer-4-master-0" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:04.707406 master-0 kubenswrapper[28758]: I0223 14:35:04.707366 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:04.709793 master-0 kubenswrapper[28758]: I0223 14:35:04.709614 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b65dc9fcb-w68qb"
Feb 23 14:35:04.752890 master-0 kubenswrapper[28758]: I0223 14:35:04.752758 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Feb 23 14:35:05.189167 master-0 kubenswrapper[28758]: I0223 14:35:05.189037 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=6.188987243 podStartE2EDuration="6.188987243s" podCreationTimestamp="2026-02-23 14:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:35:05.188915821 +0000 UTC m=+37.315231753" watchObservedRunningTime="2026-02-23 14:35:05.188987243 +0000 UTC m=+37.315303175"
Feb 23 14:35:05.508545 master-0 kubenswrapper[28758]: I0223 14:35:05.508196 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:35:05.508545 master-0 kubenswrapper[28758]: E0223 14:35:05.508424
28758 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:05.508545 master-0 kubenswrapper[28758]: E0223 14:35:05.508441 28758 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-4-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:05.508545 master-0 kubenswrapper[28758]: E0223 14:35:05.508513 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access podName:e1148263-7b15-4c12-a217-8b030ecd9348 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:07.508495701 +0000 UTC m=+39.634811633 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access") pod "installer-4-master-0" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:06.034933 master-0 kubenswrapper[28758]: I0223 14:35:06.034859 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=7.034838851 podStartE2EDuration="7.034838851s" podCreationTimestamp="2026-02-23 14:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:35:05.994003001 +0000 UTC m=+38.120318943" watchObservedRunningTime="2026-02-23 14:35:06.034838851 +0000 UTC m=+38.161154783"
Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.037839 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-5df5ffc47c-mlk22"]
Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038093 28758
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver-insecure-readyz" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038104 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver-insecure-readyz" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038124 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493a9ed3-6d64-489a-a68c-235b69a58782" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038130 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="493a9ed3-6d64-489a-a68c-235b69a58782" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038166 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fdb9885-7479-43b5-8613-b2857a798ade" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038171 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fdb9885-7479-43b5-8613-b2857a798ade" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038185 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-recovery-controller" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038191 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-recovery-controller" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038202 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="setup" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038207 28758 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="setup" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038217 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038223 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038233 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038240 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038251 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1225c7e0-f2d1-4b39-979c-c77191862c81" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038258 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="1225c7e0-f2d1-4b39-979c-c77191862c81" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038274 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-cert-syncer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038280 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-cert-syncer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038303 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler" Feb 
23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038310 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038326 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038333 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038342 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1148263-7b15-4c12-a217-8b030ecd9348" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038349 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1148263-7b15-4c12-a217-8b030ecd9348" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038357 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="wait-for-host-port" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038363 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="wait-for-host-port" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038373 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038379 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038397 28758 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038404 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038419 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038427 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038439 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerName="collect-profiles" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038446 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerName="collect-profiles" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038461 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038468 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: E0223 14:35:07.038496 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15245f43-22db-42eb-ab0b-702240986437" containerName="installer" Feb 23 14:35:07.038548 master-0 kubenswrapper[28758]: I0223 14:35:07.038503 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="15245f43-22db-42eb-ab0b-702240986437" containerName="installer" Feb 23 14:35:07.040516 master-0 
kubenswrapper[28758]: I0223 14:35:07.038628 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="493a9ed3-6d64-489a-a68c-235b69a58782" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038662 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0514f486-2562-473d-8b01-b69441b82367" containerName="assisted-installer-controller" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038674 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038692 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1148263-7b15-4c12-a217-8b030ecd9348" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038703 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="wait-for-host-port" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038719 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver-insecure-readyz" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038735 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-cert-syncer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038747 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e30288-572c-4c6f-a063-a30243db8fd8" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038762 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fdb9885-7479-43b5-8613-b2857a798ade" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038773 
28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" containerName="collect-profiles" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038783 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="15245f43-22db-42eb-ab0b-702240986437" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038793 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="setup" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038802 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f7b30e-bf6a-4e54-b009-1b0fcd830035" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038813 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="56ff46cdb00d28519af7c0cdc9ea8d11" containerName="kube-scheduler-recovery-controller" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038824 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52cbaf6-c1af-4c29-aef9-67523f5148c6" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038831 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="687e92a6cecf1e2beeef16a0b322ad08" containerName="kube-apiserver" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038842 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f67ab24-82bc-4e71-b974-e25b819986c8" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038852 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="1225c7e0-f2d1-4b39-979c-c77191862c81" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.038860 28758 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="25b855e3-80dc-4ee5-80ab-c4742578a92f" containerName="installer" Feb 23 14:35:07.040516 master-0 kubenswrapper[28758]: I0223 14:35:07.039272 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.048977 master-0 kubenswrapper[28758]: I0223 14:35:07.048268 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 23 14:35:07.048977 master-0 kubenswrapper[28758]: I0223 14:35:07.048471 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-jdc7c" Feb 23 14:35:07.048977 master-0 kubenswrapper[28758]: I0223 14:35:07.048650 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 23 14:35:07.051659 master-0 kubenswrapper[28758]: I0223 14:35:07.051280 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 23 14:35:07.051659 master-0 kubenswrapper[28758]: I0223 14:35:07.051523 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 23 14:35:07.083977 master-0 kubenswrapper[28758]: I0223 14:35:07.081754 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 23 14:35:07.083977 master-0 kubenswrapper[28758]: I0223 14:35:07.083972 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-5df5ffc47c-mlk22"] Feb 23 14:35:07.132380 master-0 kubenswrapper[28758]: I0223 14:35:07.131878 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3fb4e1-1d55-46c2-af94-6063cdedd456-config\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: 
\"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.132380 master-0 kubenswrapper[28758]: I0223 14:35:07.131982 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzsj\" (UniqueName: \"kubernetes.io/projected/aa3fb4e1-1d55-46c2-af94-6063cdedd456-kube-api-access-nbzsj\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.132380 master-0 kubenswrapper[28758]: I0223 14:35:07.132009 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa3fb4e1-1d55-46c2-af94-6063cdedd456-serving-cert\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.132380 master-0 kubenswrapper[28758]: I0223 14:35:07.132036 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa3fb4e1-1d55-46c2-af94-6063cdedd456-trusted-ca\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.135337 master-0 kubenswrapper[28758]: I0223 14:35:07.135306 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8dc4666db-hhgpf"] Feb 23 14:35:07.137444 master-0 kubenswrapper[28758]: I0223 14:35:07.137417 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.139319 master-0 kubenswrapper[28758]: I0223 14:35:07.139303 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 23 14:35:07.139630 master-0 kubenswrapper[28758]: I0223 14:35:07.139618 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 23 14:35:07.140377 master-0 kubenswrapper[28758]: I0223 14:35:07.140339 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 23 14:35:07.140553 master-0 kubenswrapper[28758]: I0223 14:35:07.140538 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 23 14:35:07.140696 master-0 kubenswrapper[28758]: I0223 14:35:07.140685 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 23 14:35:07.140801 master-0 kubenswrapper[28758]: I0223 14:35:07.140729 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 23 14:35:07.140879 master-0 kubenswrapper[28758]: I0223 14:35:07.140858 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 23 14:35:07.141195 master-0 kubenswrapper[28758]: I0223 14:35:07.141184 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 23 14:35:07.154106 master-0 kubenswrapper[28758]: I0223 14:35:07.151873 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 23 14:35:07.155096 master-0 kubenswrapper[28758]: I0223 
14:35:07.155051 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8dc4666db-hhgpf"] Feb 23 14:35:07.160295 master-0 kubenswrapper[28758]: I0223 14:35:07.160256 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 14:35:07.161697 master-0 kubenswrapper[28758]: I0223 14:35:07.161637 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 14:35:07.180920 master-0 kubenswrapper[28758]: I0223 14:35:07.180871 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 23 14:35:07.200260 master-0 kubenswrapper[28758]: I0223 14:35:07.200200 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-f27w9" Feb 23 14:35:07.233617 master-0 kubenswrapper[28758]: I0223 14:35:07.233549 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.233617 master-0 kubenswrapper[28758]: I0223 14:35:07.233602 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-dir\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.233617 master-0 kubenswrapper[28758]: I0223 14:35:07.233623 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.233951 master-0 kubenswrapper[28758]: I0223 14:35:07.233778 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-service-ca\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.233951 master-0 kubenswrapper[28758]: I0223 14:35:07.233868 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3fb4e1-1d55-46c2-af94-6063cdedd456-config\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.233951 master-0 kubenswrapper[28758]: I0223 14:35:07.233933 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-session\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234039 master-0 kubenswrapper[28758]: I0223 14:35:07.233963 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-policies\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234039 master-0 kubenswrapper[28758]: I0223 14:35:07.233988 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-login\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234039 master-0 kubenswrapper[28758]: I0223 14:35:07.234024 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8km\" (UniqueName: \"kubernetes.io/projected/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-kube-api-access-4p8km\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234127 master-0 kubenswrapper[28758]: I0223 14:35:07.234054 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzsj\" (UniqueName: \"kubernetes.io/projected/aa3fb4e1-1d55-46c2-af94-6063cdedd456-kube-api-access-nbzsj\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.234127 master-0 kubenswrapper[28758]: I0223 14:35:07.234072 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: 
\"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234127 master-0 kubenswrapper[28758]: I0223 14:35:07.234110 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa3fb4e1-1d55-46c2-af94-6063cdedd456-serving-cert\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.234212 master-0 kubenswrapper[28758]: I0223 14:35:07.234130 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-error\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234212 master-0 kubenswrapper[28758]: I0223 14:35:07.234149 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-router-certs\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234212 master-0 kubenswrapper[28758]: I0223 14:35:07.234168 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa3fb4e1-1d55-46c2-af94-6063cdedd456-trusted-ca\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.234212 master-0 kubenswrapper[28758]: I0223 14:35:07.234192 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.234322 master-0 kubenswrapper[28758]: I0223 14:35:07.234235 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.235120 master-0 kubenswrapper[28758]: I0223 14:35:07.235074 28758 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 23 14:35:07.235356 master-0 kubenswrapper[28758]: I0223 14:35:07.235331 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3fb4e1-1d55-46c2-af94-6063cdedd456-config\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.235791 master-0 kubenswrapper[28758]: I0223 14:35:07.235759 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa3fb4e1-1d55-46c2-af94-6063cdedd456-trusted-ca\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.241504 master-0 kubenswrapper[28758]: I0223 14:35:07.240036 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 23 14:35:07.244186 master-0 kubenswrapper[28758]: I0223 14:35:07.244136 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa3fb4e1-1d55-46c2-af94-6063cdedd456-serving-cert\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.294331 master-0 kubenswrapper[28758]: I0223 14:35:07.294219 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzsj\" (UniqueName: \"kubernetes.io/projected/aa3fb4e1-1d55-46c2-af94-6063cdedd456-kube-api-access-nbzsj\") pod \"console-operator-5df5ffc47c-mlk22\" (UID: \"aa3fb4e1-1d55-46c2-af94-6063cdedd456\") " 
pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.335216 master-0 kubenswrapper[28758]: I0223 14:35:07.335141 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-session\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335216 master-0 kubenswrapper[28758]: I0223 14:35:07.335206 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-policies\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335238 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-login\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335272 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p8km\" (UniqueName: \"kubernetes.io/projected/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-kube-api-access-4p8km\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335308 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335341 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-error\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335370 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-router-certs\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335398 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335437 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335462 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-dir\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335501 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.335526 master-0 kubenswrapper[28758]: I0223 14:35:07.335532 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.336003 master-0 kubenswrapper[28758]: I0223 14:35:07.335579 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-service-ca\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.336615 master-0 
kubenswrapper[28758]: I0223 14:35:07.336560 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-service-ca\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.337419 master-0 kubenswrapper[28758]: I0223 14:35:07.336821 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-dir\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.338021 master-0 kubenswrapper[28758]: I0223 14:35:07.337979 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-policies\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.338149 master-0 kubenswrapper[28758]: I0223 14:35:07.338109 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.338251 master-0 kubenswrapper[28758]: I0223 14:35:07.338202 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.338834 master-0 kubenswrapper[28758]: I0223 14:35:07.338791 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-error\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.339345 master-0 kubenswrapper[28758]: I0223 14:35:07.339307 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.339505 master-0 kubenswrapper[28758]: I0223 14:35:07.339468 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.340053 master-0 kubenswrapper[28758]: I0223 14:35:07.340018 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-router-certs\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.340220 master-0 
kubenswrapper[28758]: I0223 14:35:07.340184 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.341013 master-0 kubenswrapper[28758]: I0223 14:35:07.340959 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-session\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.347865 master-0 kubenswrapper[28758]: I0223 14:35:07.347812 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-login\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.353462 master-0 kubenswrapper[28758]: I0223 14:35:07.353412 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p8km\" (UniqueName: \"kubernetes.io/projected/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-kube-api-access-4p8km\") pod \"oauth-openshift-8dc4666db-hhgpf\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.382281 master-0 kubenswrapper[28758]: I0223 14:35:07.382227 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:07.463219 master-0 kubenswrapper[28758]: I0223 14:35:07.463163 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:07.542748 master-0 kubenswrapper[28758]: I0223 14:35:07.542412 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0" Feb 23 14:35:07.542748 master-0 kubenswrapper[28758]: E0223 14:35:07.542727 28758 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Feb 23 14:35:07.542748 master-0 kubenswrapper[28758]: E0223 14:35:07.542749 28758 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-4-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Feb 23 14:35:07.542956 master-0 kubenswrapper[28758]: E0223 14:35:07.542794 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access podName:e1148263-7b15-4c12-a217-8b030ecd9348 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:11.542777807 +0000 UTC m=+43.669093739 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access") pod "installer-4-master-0" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Feb 23 14:35:07.618049 master-0 kubenswrapper[28758]: I0223 14:35:07.617993 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-5df5ffc47c-mlk22"] Feb 23 14:35:07.624004 master-0 kubenswrapper[28758]: W0223 14:35:07.623933 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa3fb4e1_1d55_46c2_af94_6063cdedd456.slice/crio-d538bc68ce43357fa72302c6257f6c58e5a178ba13defaf210083795e4e77d4c WatchSource:0}: Error finding container d538bc68ce43357fa72302c6257f6c58e5a178ba13defaf210083795e4e77d4c: Status 404 returned error can't find the container with id d538bc68ce43357fa72302c6257f6c58e5a178ba13defaf210083795e4e77d4c Feb 23 14:35:07.626149 master-0 kubenswrapper[28758]: I0223 14:35:07.626116 28758 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 14:35:07.738275 master-0 kubenswrapper[28758]: I0223 14:35:07.738213 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" event={"ID":"aa3fb4e1-1d55-46c2-af94-6063cdedd456","Type":"ContainerStarted","Data":"d538bc68ce43357fa72302c6257f6c58e5a178ba13defaf210083795e4e77d4c"} Feb 23 14:35:07.875652 master-0 kubenswrapper[28758]: I0223 14:35:07.875521 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8dc4666db-hhgpf"] Feb 23 14:35:07.881024 master-0 kubenswrapper[28758]: W0223 14:35:07.880943 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c4d0aed_42bc_4f15_a1c8_8d7a2205eabb.slice/crio-ea89b222569bbbe7e572ca8bd65b443664b462ee3b31459db4219befd4864a4f WatchSource:0}: Error finding container ea89b222569bbbe7e572ca8bd65b443664b462ee3b31459db4219befd4864a4f: Status 404 returned error can't find the container with id ea89b222569bbbe7e572ca8bd65b443664b462ee3b31459db4219befd4864a4f Feb 23 14:35:08.313462 master-0 kubenswrapper[28758]: I0223 14:35:08.313382 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-666b887977-f7h55" Feb 23 14:35:08.314032 master-0 kubenswrapper[28758]: I0223 14:35:08.313789 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-67f44b4d6d-7lpn4" Feb 23 14:35:08.745647 master-0 kubenswrapper[28758]: I0223 14:35:08.745581 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" event={"ID":"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb","Type":"ContainerStarted","Data":"ea89b222569bbbe7e572ca8bd65b443664b462ee3b31459db4219befd4864a4f"} Feb 23 14:35:09.748400 master-0 kubenswrapper[28758]: I0223 14:35:09.748287 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:35:09.749016 master-0 kubenswrapper[28758]: I0223 14:35:09.748524 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:35:09.749016 master-0 kubenswrapper[28758]: I0223 14:35:09.748747 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:35:09.752590 master-0 kubenswrapper[28758]: I0223 14:35:09.752535 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:35:09.825758 master-0 kubenswrapper[28758]: I0223 14:35:09.825693 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Feb 23 14:35:09.838337 master-0 kubenswrapper[28758]: I0223 14:35:09.838296 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Feb 23 14:35:10.911713 master-0 kubenswrapper[28758]: I0223 14:35:10.910976 28758 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:35:10.911713 master-0 kubenswrapper[28758]: I0223 14:35:10.911251 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="afeec80f2ec1ff5cb32c2367912befef" containerName="startup-monitor" containerID="cri-o://d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317" gracePeriod=5 Feb 23 14:35:11.617846 master-0 kubenswrapper[28758]: I0223 14:35:11.617762 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0" Feb 23 14:35:11.618103 master-0 kubenswrapper[28758]: E0223 14:35:11.617979 28758 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Feb 23 14:35:11.618103 master-0 kubenswrapper[28758]: E0223 14:35:11.618014 28758 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-4-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Feb 23 14:35:11.618103 master-0 
kubenswrapper[28758]: E0223 14:35:11.618070 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access podName:e1148263-7b15-4c12-a217-8b030ecd9348 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:19.618053512 +0000 UTC m=+51.744369444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access") pod "installer-4-master-0" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Feb 23 14:35:12.698710 master-0 kubenswrapper[28758]: I0223 14:35:12.698652 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-8dc4666db-hhgpf"] Feb 23 14:35:12.777366 master-0 kubenswrapper[28758]: I0223 14:35:12.777293 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" event={"ID":"aa3fb4e1-1d55-46c2-af94-6063cdedd456","Type":"ContainerStarted","Data":"891429870c98f1ebb673037f29c2e9759bd46c56415e28efe8f6b01f526ddfea"} Feb 23 14:35:12.777695 master-0 kubenswrapper[28758]: I0223 14:35:12.777511 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:12.780644 master-0 kubenswrapper[28758]: I0223 14:35:12.780574 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" event={"ID":"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb","Type":"ContainerStarted","Data":"3d971227f0308cf6cf15455c4c4f675670b6a68a643b865faf34544e103fa0f4"} Feb 23 14:35:12.780833 master-0 kubenswrapper[28758]: I0223 14:35:12.780814 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:12.783300 
master-0 kubenswrapper[28758]: I0223 14:35:12.783253 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" Feb 23 14:35:12.818796 master-0 kubenswrapper[28758]: I0223 14:35:12.818716 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-5df5ffc47c-mlk22" podStartSLOduration=1.354470365 podStartE2EDuration="5.818699249s" podCreationTimestamp="2026-02-23 14:35:07 +0000 UTC" firstStartedPulling="2026-02-23 14:35:07.626064281 +0000 UTC m=+39.752380213" lastFinishedPulling="2026-02-23 14:35:12.090293165 +0000 UTC m=+44.216609097" observedRunningTime="2026-02-23 14:35:12.816206612 +0000 UTC m=+44.942522544" watchObservedRunningTime="2026-02-23 14:35:12.818699249 +0000 UTC m=+44.945015181" Feb 23 14:35:12.856416 master-0 kubenswrapper[28758]: I0223 14:35:12.856337 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" podStartSLOduration=1.666097861 podStartE2EDuration="5.856312462s" podCreationTimestamp="2026-02-23 14:35:07 +0000 UTC" firstStartedPulling="2026-02-23 14:35:07.883132067 +0000 UTC m=+40.009447989" lastFinishedPulling="2026-02-23 14:35:12.073346658 +0000 UTC m=+44.199662590" observedRunningTime="2026-02-23 14:35:12.852631833 +0000 UTC m=+44.978947755" watchObservedRunningTime="2026-02-23 14:35:12.856312462 +0000 UTC m=+44.982628394" Feb 23 14:35:12.914630 master-0 kubenswrapper[28758]: I0223 14:35:12.914591 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:13.038524 master-0 kubenswrapper[28758]: I0223 14:35:13.034848 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-955b69498-krrp8"] Feb 23 14:35:13.038524 master-0 kubenswrapper[28758]: E0223 14:35:13.035142 28758 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="afeec80f2ec1ff5cb32c2367912befef" containerName="startup-monitor" Feb 23 14:35:13.038524 master-0 kubenswrapper[28758]: I0223 14:35:13.035160 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="afeec80f2ec1ff5cb32c2367912befef" containerName="startup-monitor" Feb 23 14:35:13.038524 master-0 kubenswrapper[28758]: I0223 14:35:13.035318 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="afeec80f2ec1ff5cb32c2367912befef" containerName="startup-monitor" Feb 23 14:35:13.038524 master-0 kubenswrapper[28758]: I0223 14:35:13.036872 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-955b69498-krrp8" Feb 23 14:35:13.066383 master-0 kubenswrapper[28758]: I0223 14:35:13.065967 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-f7brl" Feb 23 14:35:13.066383 master-0 kubenswrapper[28758]: I0223 14:35:13.066168 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 23 14:35:13.066807 master-0 kubenswrapper[28758]: I0223 14:35:13.066749 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 23 14:35:13.069033 master-0 kubenswrapper[28758]: I0223 14:35:13.068997 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-955b69498-krrp8"] Feb 23 14:35:13.125756 master-0 kubenswrapper[28758]: I0223 14:35:13.125697 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cdrlk" Feb 23 14:35:13.168428 master-0 kubenswrapper[28758]: I0223 14:35:13.168344 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwvbz\" (UniqueName: \"kubernetes.io/projected/7e3a3f31-0bce-4392-affe-446a58284289-kube-api-access-xwvbz\") pod 
\"downloads-955b69498-krrp8\" (UID: \"7e3a3f31-0bce-4392-affe-446a58284289\") " pod="openshift-console/downloads-955b69498-krrp8" Feb 23 14:35:13.212667 master-0 kubenswrapper[28758]: I0223 14:35:13.212603 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-564f967f45-775n2"] Feb 23 14:35:13.214287 master-0 kubenswrapper[28758]: I0223 14:35:13.213588 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" Feb 23 14:35:13.215498 master-0 kubenswrapper[28758]: I0223 14:35:13.215455 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-dt6sv" Feb 23 14:35:13.216856 master-0 kubenswrapper[28758]: I0223 14:35:13.216791 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Feb 23 14:35:13.226814 master-0 kubenswrapper[28758]: I0223 14:35:13.226322 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-564f967f45-775n2"] Feb 23 14:35:13.269391 master-0 kubenswrapper[28758]: I0223 14:35:13.269157 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwvbz\" (UniqueName: \"kubernetes.io/projected/7e3a3f31-0bce-4392-affe-446a58284289-kube-api-access-xwvbz\") pod \"downloads-955b69498-krrp8\" (UID: \"7e3a3f31-0bce-4392-affe-446a58284289\") " pod="openshift-console/downloads-955b69498-krrp8" Feb 23 14:35:13.286325 master-0 kubenswrapper[28758]: I0223 14:35:13.286275 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwvbz\" (UniqueName: \"kubernetes.io/projected/7e3a3f31-0bce-4392-affe-446a58284289-kube-api-access-xwvbz\") pod \"downloads-955b69498-krrp8\" (UID: \"7e3a3f31-0bce-4392-affe-446a58284289\") " pod="openshift-console/downloads-955b69498-krrp8" Feb 23 14:35:13.379392 master-0 
kubenswrapper[28758]: I0223 14:35:13.379261 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5196fb2b-b771-4a7a-a12c-474f43b82c5f-monitoring-plugin-cert\") pod \"monitoring-plugin-564f967f45-775n2\" (UID: \"5196fb2b-b771-4a7a-a12c-474f43b82c5f\") " pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" Feb 23 14:35:13.391194 master-0 kubenswrapper[28758]: I0223 14:35:13.391142 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tl6dk" Feb 23 14:35:13.397934 master-0 kubenswrapper[28758]: I0223 14:35:13.397882 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-955b69498-krrp8" Feb 23 14:35:13.480872 master-0 kubenswrapper[28758]: I0223 14:35:13.480255 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5196fb2b-b771-4a7a-a12c-474f43b82c5f-monitoring-plugin-cert\") pod \"monitoring-plugin-564f967f45-775n2\" (UID: \"5196fb2b-b771-4a7a-a12c-474f43b82c5f\") " pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" Feb 23 14:35:13.484792 master-0 kubenswrapper[28758]: I0223 14:35:13.484761 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5196fb2b-b771-4a7a-a12c-474f43b82c5f-monitoring-plugin-cert\") pod \"monitoring-plugin-564f967f45-775n2\" (UID: \"5196fb2b-b771-4a7a-a12c-474f43b82c5f\") " pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" Feb 23 14:35:13.534224 master-0 kubenswrapper[28758]: I0223 14:35:13.534149 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" Feb 23 14:35:13.691963 master-0 kubenswrapper[28758]: I0223 14:35:13.690801 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pfb9h" Feb 23 14:35:13.820068 master-0 kubenswrapper[28758]: I0223 14:35:13.819999 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-955b69498-krrp8"] Feb 23 14:35:13.834945 master-0 kubenswrapper[28758]: W0223 14:35:13.834881 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3a3f31_0bce_4392_affe_446a58284289.slice/crio-3cd331f63a9b5e525be68cadfa3989dac3c390934a2e8f671c73d204a90f5cfb WatchSource:0}: Error finding container 3cd331f63a9b5e525be68cadfa3989dac3c390934a2e8f671c73d204a90f5cfb: Status 404 returned error can't find the container with id 3cd331f63a9b5e525be68cadfa3989dac3c390934a2e8f671c73d204a90f5cfb Feb 23 14:35:13.931385 master-0 kubenswrapper[28758]: I0223 14:35:13.931333 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-564f967f45-775n2"] Feb 23 14:35:13.995833 master-0 kubenswrapper[28758]: I0223 14:35:13.995759 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fjpvt" Feb 23 14:35:14.793244 master-0 kubenswrapper[28758]: I0223 14:35:14.793197 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-955b69498-krrp8" event={"ID":"7e3a3f31-0bce-4392-affe-446a58284289","Type":"ContainerStarted","Data":"3cd331f63a9b5e525be68cadfa3989dac3c390934a2e8f671c73d204a90f5cfb"} Feb 23 14:35:14.794891 master-0 kubenswrapper[28758]: I0223 14:35:14.794856 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" 
event={"ID":"5196fb2b-b771-4a7a-a12c-474f43b82c5f","Type":"ContainerStarted","Data":"71cdb9d7e1a9dad3d07d30ebbac5c63af8c861c65746cb4778e5c7afb862a4b6"} Feb 23 14:35:16.504827 master-0 kubenswrapper[28758]: I0223 14:35:16.504787 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_afeec80f2ec1ff5cb32c2367912befef/startup-monitor/0.log" Feb 23 14:35:16.505310 master-0 kubenswrapper[28758]: I0223 14:35:16.504856 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:35:16.630718 master-0 kubenswrapper[28758]: I0223 14:35:16.630624 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") pod \"afeec80f2ec1ff5cb32c2367912befef\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " Feb 23 14:35:16.630718 master-0 kubenswrapper[28758]: I0223 14:35:16.630699 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") pod \"afeec80f2ec1ff5cb32c2367912befef\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " Feb 23 14:35:16.630986 master-0 kubenswrapper[28758]: I0223 14:35:16.630796 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") pod \"afeec80f2ec1ff5cb32c2367912befef\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " Feb 23 14:35:16.630986 master-0 kubenswrapper[28758]: I0223 14:35:16.630852 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") pod \"afeec80f2ec1ff5cb32c2367912befef\" (UID: 
\"afeec80f2ec1ff5cb32c2367912befef\") " Feb 23 14:35:16.630986 master-0 kubenswrapper[28758]: I0223 14:35:16.630885 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") pod \"afeec80f2ec1ff5cb32c2367912befef\" (UID: \"afeec80f2ec1ff5cb32c2367912befef\") " Feb 23 14:35:16.631298 master-0 kubenswrapper[28758]: I0223 14:35:16.631249 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "afeec80f2ec1ff5cb32c2367912befef" (UID: "afeec80f2ec1ff5cb32c2367912befef"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:35:16.631364 master-0 kubenswrapper[28758]: I0223 14:35:16.631321 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests" (OuterVolumeSpecName: "manifests") pod "afeec80f2ec1ff5cb32c2367912befef" (UID: "afeec80f2ec1ff5cb32c2367912befef"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:35:16.631364 master-0 kubenswrapper[28758]: I0223 14:35:16.631351 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock" (OuterVolumeSpecName: "var-lock") pod "afeec80f2ec1ff5cb32c2367912befef" (UID: "afeec80f2ec1ff5cb32c2367912befef"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:35:16.631458 master-0 kubenswrapper[28758]: I0223 14:35:16.631379 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log" (OuterVolumeSpecName: "var-log") pod "afeec80f2ec1ff5cb32c2367912befef" (UID: "afeec80f2ec1ff5cb32c2367912befef"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:35:16.638435 master-0 kubenswrapper[28758]: I0223 14:35:16.638379 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "afeec80f2ec1ff5cb32c2367912befef" (UID: "afeec80f2ec1ff5cb32c2367912befef"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:35:16.732817 master-0 kubenswrapper[28758]: I0223 14:35:16.732683 28758 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-lock\") on node \"master-0\" DevicePath \"\""
Feb 23 14:35:16.732817 master-0 kubenswrapper[28758]: I0223 14:35:16.732731 28758 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-var-log\") on node \"master-0\" DevicePath \"\""
Feb 23 14:35:16.732817 master-0 kubenswrapper[28758]: I0223 14:35:16.732744 28758 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:35:16.732817 master-0 kubenswrapper[28758]: I0223 14:35:16.732759 28758 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-resource-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:35:16.732817 master-0 kubenswrapper[28758]: I0223 14:35:16.732771 28758 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/afeec80f2ec1ff5cb32c2367912befef-manifests\") on node \"master-0\" DevicePath \"\""
Feb 23 14:35:16.809375 master-0 kubenswrapper[28758]: I0223 14:35:16.809324 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" event={"ID":"5196fb2b-b771-4a7a-a12c-474f43b82c5f","Type":"ContainerStarted","Data":"d2e7dcb1cc1560c731b2c1eebd42bd38d3dc3e70ad8198d8daee51a66dc2d4c9"}
Feb 23 14:35:16.809952 master-0 kubenswrapper[28758]: I0223 14:35:16.809920 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2"
Feb 23 14:35:16.811323 master-0 kubenswrapper[28758]: I0223 14:35:16.811297 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_afeec80f2ec1ff5cb32c2367912befef/startup-monitor/0.log"
Feb 23 14:35:16.811388 master-0 kubenswrapper[28758]: I0223 14:35:16.811350 28758 generic.go:334] "Generic (PLEG): container finished" podID="afeec80f2ec1ff5cb32c2367912befef" containerID="d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317" exitCode=137
Feb 23 14:35:16.811425 master-0 kubenswrapper[28758]: I0223 14:35:16.811410 28758 scope.go:117] "RemoveContainer" containerID="d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317"
Feb 23 14:35:16.811621 master-0 kubenswrapper[28758]: I0223 14:35:16.811605 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Feb 23 14:35:16.817760 master-0 kubenswrapper[28758]: I0223 14:35:16.814508 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2"
Feb 23 14:35:16.833028 master-0 kubenswrapper[28758]: I0223 14:35:16.832882 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-564f967f45-775n2" podStartSLOduration=2.136575037 podStartE2EDuration="3.832867707s" podCreationTimestamp="2026-02-23 14:35:13 +0000 UTC" firstStartedPulling="2026-02-23 14:35:13.944799458 +0000 UTC m=+46.071115390" lastFinishedPulling="2026-02-23 14:35:15.641092138 +0000 UTC m=+47.767408060" observedRunningTime="2026-02-23 14:35:16.828945151 +0000 UTC m=+48.955261083" watchObservedRunningTime="2026-02-23 14:35:16.832867707 +0000 UTC m=+48.959183639"
Feb 23 14:35:16.854440 master-0 kubenswrapper[28758]: I0223 14:35:16.854397 28758 scope.go:117] "RemoveContainer" containerID="d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317"
Feb 23 14:35:16.854786 master-0 kubenswrapper[28758]: E0223 14:35:16.854762 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317\": container with ID starting with d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317 not found: ID does not exist" containerID="d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317"
Feb 23 14:35:16.854854 master-0 kubenswrapper[28758]: I0223 14:35:16.854792 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317"} err="failed to get container status \"d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317\": rpc error: code = NotFound desc = could not find container \"d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317\": container with ID starting with d39b757db5c6ad372b3e6ed02073c93d1685170abb93eb92d7e3098cd31c4317 not found: ID does not exist"
Feb 23 14:35:16.911418 master-0 kubenswrapper[28758]: I0223 14:35:16.911344 28758 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="677f91b8-e643-43cc-8f92-ce3d25ab051a"
Feb 23 14:35:18.101471 master-0 kubenswrapper[28758]: I0223 14:35:18.101387 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afeec80f2ec1ff5cb32c2367912befef" path="/var/lib/kubelet/pods/afeec80f2ec1ff5cb32c2367912befef/volumes"
Feb 23 14:35:18.102460 master-0 kubenswrapper[28758]: I0223 14:35:18.102109 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Feb 23 14:35:18.267573 master-0 kubenswrapper[28758]: I0223 14:35:18.267527 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Feb 23 14:35:18.267862 master-0 kubenswrapper[28758]: I0223 14:35:18.267820 28758 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="677f91b8-e643-43cc-8f92-ce3d25ab051a"
Feb 23 14:35:18.270939 master-0 kubenswrapper[28758]: I0223 14:35:18.270768 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Feb 23 14:35:18.270939 master-0 kubenswrapper[28758]: I0223 14:35:18.270796 28758 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="677f91b8-e643-43cc-8f92-ce3d25ab051a"
Feb 23 14:35:19.537803 master-0 kubenswrapper[28758]: I0223 14:35:19.537745 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-87c947f7d-jz97m"]
Feb 23 14:35:19.539699 master-0 kubenswrapper[28758]: I0223 14:35:19.539665 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.542044 master-0 kubenswrapper[28758]: I0223 14:35:19.541619 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 23 14:35:19.543008 master-0 kubenswrapper[28758]: I0223 14:35:19.542160 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 23 14:35:19.543008 master-0 kubenswrapper[28758]: I0223 14:35:19.542317 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 23 14:35:19.543008 master-0 kubenswrapper[28758]: I0223 14:35:19.542462 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 23 14:35:19.543008 master-0 kubenswrapper[28758]: I0223 14:35:19.542634 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-gd88p"
Feb 23 14:35:19.543008 master-0 kubenswrapper[28758]: I0223 14:35:19.542645 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 23 14:35:19.559330 master-0 kubenswrapper[28758]: I0223 14:35:19.559272 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87c947f7d-jz97m"]
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: I0223 14:35:19.682115 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-oauth-config\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: I0223 14:35:19.682200 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-serving-cert\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: I0223 14:35:19.682227 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplbg\" (UniqueName: \"kubernetes.io/projected/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-kube-api-access-wplbg\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: I0223 14:35:19.682266 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: I0223 14:35:19.682286 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-config\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: I0223 14:35:19.682320 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-oauth-serving-cert\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: I0223 14:35:19.682336 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-service-ca\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: E0223 14:35:19.682501 28758 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: E0223 14:35:19.682518 28758 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-4-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:19.684505 master-0 kubenswrapper[28758]: E0223 14:35:19.682556 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access podName:e1148263-7b15-4c12-a217-8b030ecd9348 nodeName:}" failed. No retries permitted until 2026-02-23 14:35:35.682539262 +0000 UTC m=+67.808855194 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access") pod "installer-4-master-0" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:19.784290 master-0 kubenswrapper[28758]: I0223 14:35:19.784224 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-serving-cert\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.784502 master-0 kubenswrapper[28758]: I0223 14:35:19.784420 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wplbg\" (UniqueName: \"kubernetes.io/projected/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-kube-api-access-wplbg\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.784569 master-0 kubenswrapper[28758]: I0223 14:35:19.784501 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-config\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.784569 master-0 kubenswrapper[28758]: I0223 14:35:19.784556 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-oauth-serving-cert\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.784636 master-0 kubenswrapper[28758]: I0223 14:35:19.784580 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-service-ca\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.784814 master-0 kubenswrapper[28758]: I0223 14:35:19.784770 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-oauth-config\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.785654 master-0 kubenswrapper[28758]: I0223 14:35:19.785621 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-oauth-serving-cert\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.785654 master-0 kubenswrapper[28758]: I0223 14:35:19.785647 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-service-ca\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.785741 master-0 kubenswrapper[28758]: I0223 14:35:19.785714 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-config\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.788898 master-0 kubenswrapper[28758]: I0223 14:35:19.788820 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-serving-cert\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.789037 master-0 kubenswrapper[28758]: I0223 14:35:19.789012 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-oauth-config\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.805189 master-0 kubenswrapper[28758]: I0223 14:35:19.805134 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplbg\" (UniqueName: \"kubernetes.io/projected/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-kube-api-access-wplbg\") pod \"console-87c947f7d-jz97m\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:19.865140 master-0 kubenswrapper[28758]: I0223 14:35:19.865079 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:20.253494 master-0 kubenswrapper[28758]: I0223 14:35:20.253436 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-87c947f7d-jz97m"]
Feb 23 14:35:20.267649 master-0 kubenswrapper[28758]: W0223 14:35:20.267584 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26bb163c_8d5f_42bc_b5a6_25f7e2e214e6.slice/crio-f6a667abcba45d3005bf5f20f9c1ebf19e7024613296386a208e38e2361f9569 WatchSource:0}: Error finding container f6a667abcba45d3005bf5f20f9c1ebf19e7024613296386a208e38e2361f9569: Status 404 returned error can't find the container with id f6a667abcba45d3005bf5f20f9c1ebf19e7024613296386a208e38e2361f9569
Feb 23 14:35:20.846637 master-0 kubenswrapper[28758]: I0223 14:35:20.846578 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87c947f7d-jz97m" event={"ID":"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6","Type":"ContainerStarted","Data":"f6a667abcba45d3005bf5f20f9c1ebf19e7024613296386a208e38e2361f9569"}
Feb 23 14:35:21.577719 master-0 kubenswrapper[28758]: I0223 14:35:21.577058 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fb7cc48c6-v85rz"]
Feb 23 14:35:21.578301 master-0 kubenswrapper[28758]: I0223 14:35:21.578282 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.598872 master-0 kubenswrapper[28758]: I0223 14:35:21.598820 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 23 14:35:21.600409 master-0 kubenswrapper[28758]: I0223 14:35:21.600376 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7cc48c6-v85rz"]
Feb 23 14:35:21.718936 master-0 kubenswrapper[28758]: I0223 14:35:21.718868 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-oauth-serving-cert\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.718936 master-0 kubenswrapper[28758]: I0223 14:35:21.718930 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-serving-cert\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.719168 master-0 kubenswrapper[28758]: I0223 14:35:21.718961 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-trusted-ca-bundle\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.719168 master-0 kubenswrapper[28758]: I0223 14:35:21.719013 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-oauth-config\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.719168 master-0 kubenswrapper[28758]: I0223 14:35:21.719030 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghqw\" (UniqueName: \"kubernetes.io/projected/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-kube-api-access-9ghqw\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.719168 master-0 kubenswrapper[28758]: I0223 14:35:21.719049 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-service-ca\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.719168 master-0 kubenswrapper[28758]: I0223 14:35:21.719069 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-config\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.820855 master-0 kubenswrapper[28758]: I0223 14:35:21.820798 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-config\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.821185 master-0 kubenswrapper[28758]: I0223 14:35:21.820897 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-oauth-serving-cert\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.821185 master-0 kubenswrapper[28758]: I0223 14:35:21.820917 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-serving-cert\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.821185 master-0 kubenswrapper[28758]: I0223 14:35:21.820937 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-trusted-ca-bundle\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.821185 master-0 kubenswrapper[28758]: I0223 14:35:21.820987 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-oauth-config\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.821185 master-0 kubenswrapper[28758]: I0223 14:35:21.821002 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghqw\" (UniqueName: \"kubernetes.io/projected/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-kube-api-access-9ghqw\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.821185 master-0 kubenswrapper[28758]: I0223 14:35:21.821018 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-service-ca\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.821899 master-0 kubenswrapper[28758]: I0223 14:35:21.821872 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-service-ca\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.823898 master-0 kubenswrapper[28758]: I0223 14:35:21.822873 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-config\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.823898 master-0 kubenswrapper[28758]: I0223 14:35:21.822934 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-oauth-serving-cert\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.824858 master-0 kubenswrapper[28758]: I0223 14:35:21.824795 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-trusted-ca-bundle\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.825812 master-0 kubenswrapper[28758]: I0223 14:35:21.825775 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-oauth-config\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.826125 master-0 kubenswrapper[28758]: I0223 14:35:21.826070 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-serving-cert\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.841616 master-0 kubenswrapper[28758]: I0223 14:35:21.839311 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghqw\" (UniqueName: \"kubernetes.io/projected/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-kube-api-access-9ghqw\") pod \"console-6fb7cc48c6-v85rz\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:21.904144 master-0 kubenswrapper[28758]: I0223 14:35:21.904064 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:22.315665 master-0 kubenswrapper[28758]: I0223 14:35:22.313759 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb7cc48c6-v85rz"]
Feb 23 14:35:22.321255 master-0 kubenswrapper[28758]: W0223 14:35:22.321185 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6ad4fa_0ddd_42e2_8a8c_535f1a30df9e.slice/crio-c7d2433a662653f036ca2b839c6751db69a8b78e4ff9f783c75994c09abfaada WatchSource:0}: Error finding container c7d2433a662653f036ca2b839c6751db69a8b78e4ff9f783c75994c09abfaada: Status 404 returned error can't find the container with id c7d2433a662653f036ca2b839c6751db69a8b78e4ff9f783c75994c09abfaada
Feb 23 14:35:22.862159 master-0 kubenswrapper[28758]: I0223 14:35:22.862083 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7cc48c6-v85rz" event={"ID":"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e","Type":"ContainerStarted","Data":"c7d2433a662653f036ca2b839c6751db69a8b78e4ff9f783c75994c09abfaada"}
Feb 23 14:35:23.659048 master-0 kubenswrapper[28758]: I0223 14:35:23.658979 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9"
Feb 23 14:35:24.880561 master-0 kubenswrapper[28758]: I0223 14:35:24.880489 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87c947f7d-jz97m" event={"ID":"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6","Type":"ContainerStarted","Data":"49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de"}
Feb 23 14:35:24.883815 master-0 kubenswrapper[28758]: I0223 14:35:24.883773 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7cc48c6-v85rz" event={"ID":"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e","Type":"ContainerStarted","Data":"84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e"}
Feb 23 14:35:24.902659 master-0 kubenswrapper[28758]: I0223 14:35:24.902569 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-87c947f7d-jz97m" podStartSLOduration=1.6019218309999999 podStartE2EDuration="5.902522806s" podCreationTimestamp="2026-02-23 14:35:19 +0000 UTC" firstStartedPulling="2026-02-23 14:35:20.271691774 +0000 UTC m=+52.398007706" lastFinishedPulling="2026-02-23 14:35:24.572292739 +0000 UTC m=+56.698608681" observedRunningTime="2026-02-23 14:35:24.899770551 +0000 UTC m=+57.026086513" watchObservedRunningTime="2026-02-23 14:35:24.902522806 +0000 UTC m=+57.028838738"
Feb 23 14:35:24.923045 master-0 kubenswrapper[28758]: I0223 14:35:24.922811 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fb7cc48c6-v85rz" podStartSLOduration=1.67008445 podStartE2EDuration="3.922792842s" podCreationTimestamp="2026-02-23 14:35:21 +0000 UTC" firstStartedPulling="2026-02-23 14:35:22.324564882 +0000 UTC m=+54.450880814" lastFinishedPulling="2026-02-23 14:35:24.577273274 +0000 UTC m=+56.703589206" observedRunningTime="2026-02-23 14:35:24.922326739 +0000 UTC m=+57.048642671" watchObservedRunningTime="2026-02-23 14:35:24.922792842 +0000 UTC m=+57.049108774"
Feb 23 14:35:28.076194 master-0 kubenswrapper[28758]: I0223 14:35:28.076132 28758 scope.go:117] "RemoveContainer" containerID="40ca3552a0c110bf631be979ddbff1eb4abba63ee7c1c34c419314566066d566"
Feb 23 14:35:28.097053 master-0 kubenswrapper[28758]: I0223 14:35:28.097012 28758 scope.go:117] "RemoveContainer" containerID="e3d987d25306f70a7327b5bce6ea549b476972db2d3366cf37d35b30c1531578"
Feb 23 14:35:28.115789 master-0 kubenswrapper[28758]: I0223 14:35:28.115739 28758 scope.go:117] "RemoveContainer" containerID="bdd3290dcf6f732f006b381bec2edfc3a7a58623787040a36811efd529225351"
Feb 23 14:35:29.865432 master-0 kubenswrapper[28758]: I0223 14:35:29.865335 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:29.865432 master-0 kubenswrapper[28758]: I0223 14:35:29.865411 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-87c947f7d-jz97m"
Feb 23 14:35:29.867181 master-0 kubenswrapper[28758]: I0223 14:35:29.867149 28758 patch_prober.go:28] interesting pod/console-87c947f7d-jz97m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body=
Feb 23 14:35:29.867237 master-0 kubenswrapper[28758]: I0223 14:35:29.867186 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-87c947f7d-jz97m" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused"
Feb 23 14:35:31.904596 master-0 kubenswrapper[28758]: I0223 14:35:31.904486 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:31.904596 master-0 kubenswrapper[28758]: I0223 14:35:31.904577 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fb7cc48c6-v85rz"
Feb 23 14:35:31.906645 master-0 kubenswrapper[28758]: I0223 14:35:31.906596 28758 patch_prober.go:28] interesting pod/console-6fb7cc48c6-v85rz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body=
Feb 23 14:35:31.906729 master-0 kubenswrapper[28758]: I0223 14:35:31.906659 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6fb7cc48c6-v85rz" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused"
Feb 23 14:35:35.741231 master-0 kubenswrapper[28758]: I0223 14:35:35.741169 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:35:35.741761 master-0 kubenswrapper[28758]: E0223 14:35:35.741348 28758 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:35.741761 master-0 kubenswrapper[28758]: E0223 14:35:35.741369 28758 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-4-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:35.741761 master-0 kubenswrapper[28758]: E0223 14:35:35.741426 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access podName:e1148263-7b15-4c12-a217-8b030ecd9348 nodeName:}" failed. No retries permitted until 2026-02-23 14:36:07.741407942 +0000 UTC m=+99.867723884 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access") pod "installer-4-master-0" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Feb 23 14:35:38.813564 master-0 kubenswrapper[28758]: I0223 14:35:38.813467 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" podUID="9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" containerName="oauth-openshift" containerID="cri-o://3d971227f0308cf6cf15455c4c4f675670b6a68a643b865faf34544e103fa0f4" gracePeriod=15
Feb 23 14:35:39.866664 master-0 kubenswrapper[28758]: I0223 14:35:39.866593 28758 patch_prober.go:28] interesting pod/console-87c947f7d-jz97m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body=
Feb 23 14:35:39.866664 master-0 kubenswrapper[28758]: I0223 14:35:39.866658 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-87c947f7d-jz97m" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused"
Feb 23 14:35:39.983591 master-0 kubenswrapper[28758]: I0223 14:35:39.983535 28758 generic.go:334] "Generic (PLEG): container finished" podID="9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" containerID="3d971227f0308cf6cf15455c4c4f675670b6a68a643b865faf34544e103fa0f4" exitCode=0
Feb 23 14:35:39.983591 master-0 kubenswrapper[28758]: I0223 14:35:39.983589 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf"
event={"ID":"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb","Type":"ContainerDied","Data":"3d971227f0308cf6cf15455c4c4f675670b6a68a643b865faf34544e103fa0f4"} Feb 23 14:35:41.905422 master-0 kubenswrapper[28758]: I0223 14:35:41.905362 28758 patch_prober.go:28] interesting pod/console-6fb7cc48c6-v85rz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Feb 23 14:35:41.905422 master-0 kubenswrapper[28758]: I0223 14:35:41.905422 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6fb7cc48c6-v85rz" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Feb 23 14:35:42.520299 master-0 kubenswrapper[28758]: I0223 14:35:42.520225 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Feb 23 14:35:42.521220 master-0 kubenswrapper[28758]: I0223 14:35:42.521190 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.525097 master-0 kubenswrapper[28758]: I0223 14:35:42.524575 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5gfcq" Feb 23 14:35:42.525097 master-0 kubenswrapper[28758]: I0223 14:35:42.524960 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 14:35:42.530365 master-0 kubenswrapper[28758]: I0223 14:35:42.530311 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Feb 23 14:35:42.589557 master-0 kubenswrapper[28758]: I0223 14:35:42.589505 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.589752 master-0 kubenswrapper[28758]: I0223 14:35:42.589566 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-var-lock\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.589752 master-0 kubenswrapper[28758]: I0223 14:35:42.589606 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069569a4-34c1-4752-af70-b31bcfca4177-kube-api-access\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.691195 master-0 kubenswrapper[28758]: I0223 14:35:42.691126 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-var-lock\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.691195 master-0 kubenswrapper[28758]: I0223 14:35:42.691184 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069569a4-34c1-4752-af70-b31bcfca4177-kube-api-access\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.691504 master-0 kubenswrapper[28758]: I0223 14:35:42.691306 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.691504 master-0 kubenswrapper[28758]: I0223 14:35:42.691383 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.691504 master-0 kubenswrapper[28758]: I0223 14:35:42.691430 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-var-lock\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.706667 master-0 kubenswrapper[28758]: I0223 14:35:42.706619 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069569a4-34c1-4752-af70-b31bcfca4177-kube-api-access\") pod \"installer-5-master-0\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:42.847700 master-0 kubenswrapper[28758]: I0223 14:35:42.847576 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:35:47.642791 master-0 kubenswrapper[28758]: I0223 14:35:47.642584 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Feb 23 14:35:48.464411 master-0 kubenswrapper[28758]: I0223 14:35:48.464327 28758 patch_prober.go:28] interesting pod/oauth-openshift-8dc4666db-hhgpf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.90:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 14:35:48.464688 master-0 kubenswrapper[28758]: I0223 14:35:48.464406 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" podUID="9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.90:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:35:48.867746 master-0 kubenswrapper[28758]: W0223 14:35:48.867684 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod069569a4_34c1_4752_af70_b31bcfca4177.slice/crio-ef785d771950769f3a1e100f2ce635b4ca76cd2406c83804433533340c74bdc4 WatchSource:0}: Error finding container ef785d771950769f3a1e100f2ce635b4ca76cd2406c83804433533340c74bdc4: Status 404 returned error can't find the container with id 
ef785d771950769f3a1e100f2ce635b4ca76cd2406c83804433533340c74bdc4 Feb 23 14:35:48.908622 master-0 kubenswrapper[28758]: I0223 14:35:48.908560 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:49.040444 master-0 kubenswrapper[28758]: I0223 14:35:49.040381 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"069569a4-34c1-4752-af70-b31bcfca4177","Type":"ContainerStarted","Data":"ef785d771950769f3a1e100f2ce635b4ca76cd2406c83804433533340c74bdc4"} Feb 23 14:35:49.042290 master-0 kubenswrapper[28758]: I0223 14:35:49.042265 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" event={"ID":"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb","Type":"ContainerDied","Data":"ea89b222569bbbe7e572ca8bd65b443664b462ee3b31459db4219befd4864a4f"} Feb 23 14:35:49.042342 master-0 kubenswrapper[28758]: I0223 14:35:49.042297 28758 scope.go:117] "RemoveContainer" containerID="3d971227f0308cf6cf15455c4c4f675670b6a68a643b865faf34544e103fa0f4" Feb 23 14:35:49.042375 master-0 kubenswrapper[28758]: I0223 14:35:49.042344 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8dc4666db-hhgpf" Feb 23 14:35:49.081046 master-0 kubenswrapper[28758]: I0223 14:35:49.080934 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-router-certs\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081245 master-0 kubenswrapper[28758]: I0223 14:35:49.081089 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p8km\" (UniqueName: \"kubernetes.io/projected/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-kube-api-access-4p8km\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081245 master-0 kubenswrapper[28758]: I0223 14:35:49.081149 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-trusted-ca-bundle\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081245 master-0 kubenswrapper[28758]: I0223 14:35:49.081208 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-service-ca\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081245 master-0 kubenswrapper[28758]: I0223 14:35:49.081229 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-error\") pod 
\"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081418 master-0 kubenswrapper[28758]: I0223 14:35:49.081270 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-login\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081418 master-0 kubenswrapper[28758]: I0223 14:35:49.081338 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-provider-selection\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081418 master-0 kubenswrapper[28758]: I0223 14:35:49.081386 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-dir\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081564 master-0 kubenswrapper[28758]: I0223 14:35:49.081430 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-policies\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081564 master-0 kubenswrapper[28758]: I0223 14:35:49.081463 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-session\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: 
\"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081564 master-0 kubenswrapper[28758]: I0223 14:35:49.081512 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-serving-cert\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081564 master-0 kubenswrapper[28758]: I0223 14:35:49.081529 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-ocp-branding-template\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.081694 master-0 kubenswrapper[28758]: I0223 14:35:49.081568 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-cliconfig\") pod \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\" (UID: \"9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb\") " Feb 23 14:35:49.082100 master-0 kubenswrapper[28758]: I0223 14:35:49.082050 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:35:49.082169 master-0 kubenswrapper[28758]: I0223 14:35:49.082132 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:35:49.082871 master-0 kubenswrapper[28758]: I0223 14:35:49.082694 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:35:49.082987 master-0 kubenswrapper[28758]: I0223 14:35:49.082945 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:35:49.083084 master-0 kubenswrapper[28758]: I0223 14:35:49.083007 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:35:49.084736 master-0 kubenswrapper[28758]: I0223 14:35:49.084671 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:35:49.086023 master-0 kubenswrapper[28758]: I0223 14:35:49.085973 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:35:49.088613 master-0 kubenswrapper[28758]: I0223 14:35:49.088561 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:35:49.088685 master-0 kubenswrapper[28758]: I0223 14:35:49.088617 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:35:49.088930 master-0 kubenswrapper[28758]: I0223 14:35:49.088859 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:35:49.088977 master-0 kubenswrapper[28758]: I0223 14:35:49.088879 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:35:49.089071 master-0 kubenswrapper[28758]: I0223 14:35:49.089023 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-kube-api-access-4p8km" (OuterVolumeSpecName: "kube-api-access-4p8km") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "kube-api-access-4p8km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:35:49.092577 master-0 kubenswrapper[28758]: I0223 14:35:49.092496 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" (UID: "9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:35:49.184910 master-0 kubenswrapper[28758]: I0223 14:35:49.184860 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.184910 master-0 kubenswrapper[28758]: I0223 14:35:49.184912 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p8km\" (UniqueName: \"kubernetes.io/projected/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-kube-api-access-4p8km\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.184926 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.184935 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.184948 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.184960 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.184973 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.184987 28758 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.185000 28758 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-audit-policies\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.185021 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.185033 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 
14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.185047 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.185136 master-0 kubenswrapper[28758]: I0223 14:35:49.185059 28758 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Feb 23 14:35:49.193090 master-0 kubenswrapper[28758]: I0223 14:35:49.193030 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86776854f6-g4ksn"] Feb 23 14:35:49.193445 master-0 kubenswrapper[28758]: E0223 14:35:49.193390 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" containerName="oauth-openshift" Feb 23 14:35:49.193445 master-0 kubenswrapper[28758]: I0223 14:35:49.193410 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" containerName="oauth-openshift" Feb 23 14:35:49.193620 master-0 kubenswrapper[28758]: I0223 14:35:49.193592 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" containerName="oauth-openshift" Feb 23 14:35:49.194185 master-0 kubenswrapper[28758]: I0223 14:35:49.194162 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.211775 master-0 kubenswrapper[28758]: I0223 14:35:49.211731 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86776854f6-g4ksn"] Feb 23 14:35:49.285969 master-0 kubenswrapper[28758]: I0223 14:35:49.285904 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-login\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286182 master-0 kubenswrapper[28758]: I0223 14:35:49.285977 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286182 master-0 kubenswrapper[28758]: I0223 14:35:49.286002 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-audit-policies\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286182 master-0 kubenswrapper[28758]: I0223 14:35:49.286026 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-service-ca\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286182 master-0 kubenswrapper[28758]: I0223 14:35:49.286044 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286343 master-0 kubenswrapper[28758]: I0223 14:35:49.286228 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-session\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286375 master-0 kubenswrapper[28758]: I0223 14:35:49.286336 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvplm\" (UniqueName: \"kubernetes.io/projected/ca8ab367-b3d7-4663-b4af-284e424dced7-kube-api-access-vvplm\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286375 master-0 kubenswrapper[28758]: I0223 14:35:49.286361 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286437 master-0 kubenswrapper[28758]: I0223 14:35:49.286425 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286468 master-0 kubenswrapper[28758]: I0223 14:35:49.286455 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-error\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286552 master-0 kubenswrapper[28758]: I0223 14:35:49.286526 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-router-certs\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286595 master-0 kubenswrapper[28758]: I0223 14:35:49.286584 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca8ab367-b3d7-4663-b4af-284e424dced7-audit-dir\") pod 
\"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.286695 master-0 kubenswrapper[28758]: I0223 14:35:49.286653 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.389624 master-0 kubenswrapper[28758]: I0223 14:35:49.389573 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.389771 master-0 kubenswrapper[28758]: I0223 14:35:49.389744 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-error\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.389816 master-0 kubenswrapper[28758]: I0223 14:35:49.389784 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-router-certs\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 
23 14:35:49.390010 master-0 kubenswrapper[28758]: I0223 14:35:49.389965 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca8ab367-b3d7-4663-b4af-284e424dced7-audit-dir\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390059 master-0 kubenswrapper[28758]: I0223 14:35:49.390042 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390209 master-0 kubenswrapper[28758]: I0223 14:35:49.390155 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca8ab367-b3d7-4663-b4af-284e424dced7-audit-dir\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390276 master-0 kubenswrapper[28758]: I0223 14:35:49.390239 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-login\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390396 master-0 kubenswrapper[28758]: I0223 14:35:49.390362 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390455 master-0 kubenswrapper[28758]: I0223 14:35:49.390418 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-audit-policies\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390528 master-0 kubenswrapper[28758]: I0223 14:35:49.390489 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-service-ca\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390575 master-0 kubenswrapper[28758]: I0223 14:35:49.390528 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390575 master-0 kubenswrapper[28758]: I0223 14:35:49.390567 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-session\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: 
\"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390662 master-0 kubenswrapper[28758]: I0223 14:35:49.390596 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-8dc4666db-hhgpf"] Feb 23 14:35:49.390698 master-0 kubenswrapper[28758]: I0223 14:35:49.390621 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvplm\" (UniqueName: \"kubernetes.io/projected/ca8ab367-b3d7-4663-b4af-284e424dced7-kube-api-access-vvplm\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.390760 master-0 kubenswrapper[28758]: I0223 14:35:49.390734 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.391233 master-0 kubenswrapper[28758]: I0223 14:35:49.391203 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-audit-policies\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.391384 master-0 kubenswrapper[28758]: I0223 14:35:49.391338 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: 
\"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.391467 master-0 kubenswrapper[28758]: I0223 14:35:49.391347 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-service-ca\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.392539 master-0 kubenswrapper[28758]: I0223 14:35:49.392507 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.393994 master-0 kubenswrapper[28758]: I0223 14:35:49.393952 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.394714 master-0 kubenswrapper[28758]: I0223 14:35:49.394680 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-session\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.394825 master-0 kubenswrapper[28758]: I0223 14:35:49.394794 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-router-certs\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.394965 master-0 kubenswrapper[28758]: I0223 14:35:49.394909 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-login\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.395633 master-0 kubenswrapper[28758]: I0223 14:35:49.395602 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.397205 master-0 kubenswrapper[28758]: I0223 14:35:49.397159 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-user-template-error\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.397747 master-0 kubenswrapper[28758]: I0223 14:35:49.397714 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-8dc4666db-hhgpf"] Feb 23 14:35:49.404654 master-0 kubenswrapper[28758]: I0223 
14:35:49.404624 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca8ab367-b3d7-4663-b4af-284e424dced7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.406218 master-0 kubenswrapper[28758]: I0223 14:35:49.406165 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvplm\" (UniqueName: \"kubernetes.io/projected/ca8ab367-b3d7-4663-b4af-284e424dced7-kube-api-access-vvplm\") pod \"oauth-openshift-86776854f6-g4ksn\" (UID: \"ca8ab367-b3d7-4663-b4af-284e424dced7\") " pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.553928 master-0 kubenswrapper[28758]: I0223 14:35:49.553830 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:49.866043 master-0 kubenswrapper[28758]: I0223 14:35:49.865963 28758 patch_prober.go:28] interesting pod/console-87c947f7d-jz97m container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" start-of-body= Feb 23 14:35:49.866043 master-0 kubenswrapper[28758]: I0223 14:35:49.866022 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-87c947f7d-jz97m" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerName="console" probeResult="failure" output="Get \"https://10.128.0.93:8443/health\": dial tcp 10.128.0.93:8443: connect: connection refused" Feb 23 14:35:49.974232 master-0 kubenswrapper[28758]: I0223 14:35:49.974160 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86776854f6-g4ksn"] Feb 23 14:35:49.984799 master-0 
kubenswrapper[28758]: W0223 14:35:49.984735 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca8ab367_b3d7_4663_b4af_284e424dced7.slice/crio-019fdeb85131a2fe05b2196d1bd52ac3e20aeea09670836126897e9a26f9dcc5 WatchSource:0}: Error finding container 019fdeb85131a2fe05b2196d1bd52ac3e20aeea09670836126897e9a26f9dcc5: Status 404 returned error can't find the container with id 019fdeb85131a2fe05b2196d1bd52ac3e20aeea09670836126897e9a26f9dcc5 Feb 23 14:35:50.056410 master-0 kubenswrapper[28758]: I0223 14:35:50.056339 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-955b69498-krrp8" event={"ID":"7e3a3f31-0bce-4392-affe-446a58284289","Type":"ContainerStarted","Data":"577b386e3de794b8ea1d71f6499422722906e5c439647ae2594727c93f6172e5"} Feb 23 14:35:50.056824 master-0 kubenswrapper[28758]: I0223 14:35:50.056780 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-955b69498-krrp8" Feb 23 14:35:50.058356 master-0 kubenswrapper[28758]: I0223 14:35:50.058309 28758 patch_prober.go:28] interesting pod/downloads-955b69498-krrp8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Feb 23 14:35:50.058474 master-0 kubenswrapper[28758]: I0223 14:35:50.058361 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-955b69498-krrp8" podUID="7e3a3f31-0bce-4392-affe-446a58284289" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Feb 23 14:35:50.060365 master-0 kubenswrapper[28758]: I0223 14:35:50.060320 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" 
event={"ID":"069569a4-34c1-4752-af70-b31bcfca4177","Type":"ContainerStarted","Data":"bbef5d1f87604552e5038241fb1806db8690b9c5aaff08ce5041ceb6958b2770"} Feb 23 14:35:50.065458 master-0 kubenswrapper[28758]: I0223 14:35:50.065406 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" event={"ID":"ca8ab367-b3d7-4663-b4af-284e424dced7","Type":"ContainerStarted","Data":"019fdeb85131a2fe05b2196d1bd52ac3e20aeea09670836126897e9a26f9dcc5"} Feb 23 14:35:50.076291 master-0 kubenswrapper[28758]: I0223 14:35:50.076170 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-955b69498-krrp8" podStartSLOduration=1.618042286 podStartE2EDuration="37.076147742s" podCreationTimestamp="2026-02-23 14:35:13 +0000 UTC" firstStartedPulling="2026-02-23 14:35:13.839554273 +0000 UTC m=+45.965870205" lastFinishedPulling="2026-02-23 14:35:49.297659729 +0000 UTC m=+81.423975661" observedRunningTime="2026-02-23 14:35:50.071450566 +0000 UTC m=+82.197766498" watchObservedRunningTime="2026-02-23 14:35:50.076147742 +0000 UTC m=+82.202463674" Feb 23 14:35:50.099009 master-0 kubenswrapper[28758]: I0223 14:35:50.097900 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=8.097877288 podStartE2EDuration="8.097877288s" podCreationTimestamp="2026-02-23 14:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:35:50.093585022 +0000 UTC m=+82.219900954" watchObservedRunningTime="2026-02-23 14:35:50.097877288 +0000 UTC m=+82.224193220" Feb 23 14:35:50.102258 master-0 kubenswrapper[28758]: I0223 14:35:50.102129 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb" path="/var/lib/kubelet/pods/9c4d0aed-42bc-4f15-a1c8-8d7a2205eabb/volumes" Feb 23 
14:35:51.079566 master-0 kubenswrapper[28758]: I0223 14:35:51.079446 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" event={"ID":"ca8ab367-b3d7-4663-b4af-284e424dced7","Type":"ContainerStarted","Data":"b88ea27ac1324d32f8ab971c4cf20a483eb0650b2b16e86e85dc382cd64afc85"} Feb 23 14:35:51.081195 master-0 kubenswrapper[28758]: I0223 14:35:51.081086 28758 patch_prober.go:28] interesting pod/downloads-955b69498-krrp8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" start-of-body= Feb 23 14:35:51.081333 master-0 kubenswrapper[28758]: I0223 14:35:51.081205 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-955b69498-krrp8" podUID="7e3a3f31-0bce-4392-affe-446a58284289" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.91:8080/\": dial tcp 10.128.0.91:8080: connect: connection refused" Feb 23 14:35:51.115715 master-0 kubenswrapper[28758]: I0223 14:35:51.115621 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" podStartSLOduration=13.115596437 podStartE2EDuration="13.115596437s" podCreationTimestamp="2026-02-23 14:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:35:51.113671825 +0000 UTC m=+83.239987787" watchObservedRunningTime="2026-02-23 14:35:51.115596437 +0000 UTC m=+83.241912379" Feb 23 14:35:51.905135 master-0 kubenswrapper[28758]: I0223 14:35:51.905055 28758 patch_prober.go:28] interesting pod/console-6fb7cc48c6-v85rz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" 
start-of-body= Feb 23 14:35:51.905391 master-0 kubenswrapper[28758]: I0223 14:35:51.905137 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6fb7cc48c6-v85rz" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Feb 23 14:35:52.088765 master-0 kubenswrapper[28758]: I0223 14:35:52.088659 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:52.098790 master-0 kubenswrapper[28758]: I0223 14:35:52.098724 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86776854f6-g4ksn" Feb 23 14:35:53.415537 master-0 kubenswrapper[28758]: I0223 14:35:53.415431 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-955b69498-krrp8" Feb 23 14:35:59.453080 master-0 kubenswrapper[28758]: I0223 14:35:59.453003 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-87c947f7d-jz97m"] Feb 23 14:35:59.498384 master-0 kubenswrapper[28758]: I0223 14:35:59.498315 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-556bbd75bc-q4jfx"] Feb 23 14:35:59.499191 master-0 kubenswrapper[28758]: I0223 14:35:59.499165 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.560239 master-0 kubenswrapper[28758]: I0223 14:35:59.560170 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-oauth-serving-cert\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.560544 master-0 kubenswrapper[28758]: I0223 14:35:59.560525 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-trusted-ca-bundle\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.560693 master-0 kubenswrapper[28758]: I0223 14:35:59.560672 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-serving-cert\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.560879 master-0 kubenswrapper[28758]: I0223 14:35:59.560840 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-console-config\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.560932 master-0 kubenswrapper[28758]: I0223 14:35:59.560887 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-oauth-config\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.561161 master-0 kubenswrapper[28758]: I0223 14:35:59.561109 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-service-ca\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.561207 master-0 kubenswrapper[28758]: I0223 14:35:59.561171 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54fz\" (UniqueName: \"kubernetes.io/projected/0119f38b-9247-4e4c-af16-31202765777a-kube-api-access-b54fz\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.662257 master-0 kubenswrapper[28758]: I0223 14:35:59.662158 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-oauth-config\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.662257 master-0 kubenswrapper[28758]: I0223 14:35:59.662217 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-console-config\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.662681 master-0 kubenswrapper[28758]: I0223 14:35:59.662291 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-service-ca\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.662681 master-0 kubenswrapper[28758]: I0223 14:35:59.662318 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b54fz\" (UniqueName: \"kubernetes.io/projected/0119f38b-9247-4e4c-af16-31202765777a-kube-api-access-b54fz\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.662681 master-0 kubenswrapper[28758]: I0223 14:35:59.662342 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-oauth-serving-cert\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.662681 master-0 kubenswrapper[28758]: I0223 14:35:59.662370 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-trusted-ca-bundle\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.662681 master-0 kubenswrapper[28758]: I0223 14:35:59.662414 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-serving-cert\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:35:59.664378 master-0 kubenswrapper[28758]: I0223 
14:35:59.664327 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-console-config\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:35:59.664859 master-0 kubenswrapper[28758]: I0223 14:35:59.664802 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-service-ca\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:35:59.665382 master-0 kubenswrapper[28758]: I0223 14:35:59.665339 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-oauth-serving-cert\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:35:59.665718 master-0 kubenswrapper[28758]: I0223 14:35:59.665657 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-trusted-ca-bundle\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:35:59.665718 master-0 kubenswrapper[28758]: I0223 14:35:59.665709 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-serving-cert\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:35:59.669365 master-0 kubenswrapper[28758]: I0223 14:35:59.669317 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-oauth-config\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:36:00.082249 master-0 kubenswrapper[28758]: I0223 14:36:00.081740 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556bbd75bc-q4jfx"]
Feb 23 14:36:00.100603 master-0 kubenswrapper[28758]: I0223 14:36:00.100522 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b54fz\" (UniqueName: \"kubernetes.io/projected/0119f38b-9247-4e4c-af16-31202765777a-kube-api-access-b54fz\") pod \"console-556bbd75bc-q4jfx\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") " pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:36:00.119602 master-0 kubenswrapper[28758]: I0223 14:36:00.119544 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:36:00.517553 master-0 kubenswrapper[28758]: I0223 14:36:00.516840 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556bbd75bc-q4jfx"]
Feb 23 14:36:00.527249 master-0 kubenswrapper[28758]: W0223 14:36:00.527180 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0119f38b_9247_4e4c_af16_31202765777a.slice/crio-6539d142a95f7360921c41ed8fc49fd18b776d7d20b04875bb0b029b370f8ffd WatchSource:0}: Error finding container 6539d142a95f7360921c41ed8fc49fd18b776d7d20b04875bb0b029b370f8ffd: Status 404 returned error can't find the container with id 6539d142a95f7360921c41ed8fc49fd18b776d7d20b04875bb0b029b370f8ffd
Feb 23 14:36:01.166753 master-0 kubenswrapper[28758]: I0223 14:36:01.166696 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556bbd75bc-q4jfx" event={"ID":"0119f38b-9247-4e4c-af16-31202765777a","Type":"ContainerStarted","Data":"4b9305d2d2ffc30dbee7c9a127f79d70cae04ade1e03a8ca762fc497eed55098"}
Feb 23 14:36:01.166753 master-0 kubenswrapper[28758]: I0223 14:36:01.166758 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556bbd75bc-q4jfx" event={"ID":"0119f38b-9247-4e4c-af16-31202765777a","Type":"ContainerStarted","Data":"6539d142a95f7360921c41ed8fc49fd18b776d7d20b04875bb0b029b370f8ffd"}
Feb 23 14:36:01.905710 master-0 kubenswrapper[28758]: I0223 14:36:01.905622 28758 patch_prober.go:28] interesting pod/console-6fb7cc48c6-v85rz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body=
Feb 23 14:36:01.906301 master-0 kubenswrapper[28758]: I0223 14:36:01.905706 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6fb7cc48c6-v85rz" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused"
Feb 23 14:36:03.160649 master-0 kubenswrapper[28758]: I0223 14:36:03.160535 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-556bbd75bc-q4jfx" podStartSLOduration=4.160507323 podStartE2EDuration="4.160507323s" podCreationTimestamp="2026-02-23 14:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:36:03.157072882 +0000 UTC m=+95.283388904" watchObservedRunningTime="2026-02-23 14:36:03.160507323 +0000 UTC m=+95.286823285"
Feb 23 14:36:07.792110 master-0 kubenswrapper[28758]: I0223 14:36:07.792009 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:36:07.795575 master-0 kubenswrapper[28758]: I0223 14:36:07.795529 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"installer-4-master-0\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") " pod="openshift-kube-apiserver/installer-4-master-0"
Feb 23 14:36:07.893767 master-0 kubenswrapper[28758]: I0223 14:36:07.893678 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") pod \"e1148263-7b15-4c12-a217-8b030ecd9348\" (UID: \"e1148263-7b15-4c12-a217-8b030ecd9348\") "
Feb 23 14:36:07.896465 master-0 kubenswrapper[28758]: I0223 14:36:07.896396 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e1148263-7b15-4c12-a217-8b030ecd9348" (UID: "e1148263-7b15-4c12-a217-8b030ecd9348"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:36:07.995183 master-0 kubenswrapper[28758]: I0223 14:36:07.995107 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1148263-7b15-4c12-a217-8b030ecd9348-kube-api-access\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:10.120373 master-0 kubenswrapper[28758]: I0223 14:36:10.120283 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:36:10.120373 master-0 kubenswrapper[28758]: I0223 14:36:10.120372 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:36:10.122952 master-0 kubenswrapper[28758]: I0223 14:36:10.122898 28758 patch_prober.go:28] interesting pod/console-556bbd75bc-q4jfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Feb 23 14:36:10.123030 master-0 kubenswrapper[28758]: I0223 14:36:10.122991 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Feb 23 14:36:11.905544 master-0 kubenswrapper[28758]: I0223 14:36:11.905415 28758 patch_prober.go:28] interesting pod/console-6fb7cc48c6-v85rz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body=
Feb 23 14:36:11.905544 master-0 kubenswrapper[28758]: I0223 14:36:11.905500 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6fb7cc48c6-v85rz" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused"
Feb 23 14:36:20.120882 master-0 kubenswrapper[28758]: I0223 14:36:20.120750 28758 patch_prober.go:28] interesting pod/console-556bbd75bc-q4jfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body=
Feb 23 14:36:20.122132 master-0 kubenswrapper[28758]: I0223 14:36:20.120926 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused"
Feb 23 14:36:21.906406 master-0 kubenswrapper[28758]: I0223 14:36:21.905686 28758 patch_prober.go:28] interesting pod/console-6fb7cc48c6-v85rz container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body=
Feb 23 14:36:21.907077 master-0 kubenswrapper[28758]: I0223 14:36:21.906441 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6fb7cc48c6-v85rz" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused"
Feb 23 14:36:22.048357 master-0 kubenswrapper[28758]: I0223 14:36:22.048286 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"]
Feb 23 14:36:22.048614 master-0 kubenswrapper[28758]: I0223 14:36:22.048563 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager" containerID="cri-o://18b02500a922018fef0fe170792a110deb1ca490ebe442765b459f2885b97744" gracePeriod=30
Feb 23 14:36:22.068301 master-0 kubenswrapper[28758]: I0223 14:36:22.068248 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"]
Feb 23 14:36:22.068845 master-0 kubenswrapper[28758]: I0223 14:36:22.068813 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager" containerID="cri-o://f3016d799a76a0e861077c16a305235311ad634f8054a70ca605ccf2e9c27c2c" gracePeriod=30
Feb 23 14:36:22.425195 master-0 kubenswrapper[28758]: I0223 14:36:22.425069 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c2393-e914-4c10-a18f-b30fcf012d19" containerID="18b02500a922018fef0fe170792a110deb1ca490ebe442765b459f2885b97744" exitCode=0
Feb 23 14:36:22.425395 master-0 kubenswrapper[28758]: I0223 14:36:22.425220 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerDied","Data":"18b02500a922018fef0fe170792a110deb1ca490ebe442765b459f2885b97744"}
Feb 23 14:36:22.425395 master-0 kubenswrapper[28758]: I0223 14:36:22.425328 28758 scope.go:117] "RemoveContainer" containerID="943dceb3c19889e0c21143fb06ce16ff62e733710dc9afea16ddd3ae92da4904"
Feb 23 14:36:22.434661 master-0 kubenswrapper[28758]: I0223 14:36:22.434623 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-8bb99f4f-msq8f_482284fd-6911-4ba6-8d57-7966cc51117a/route-controller-manager/1.log"
Feb 23 14:36:22.434820 master-0 kubenswrapper[28758]: I0223 14:36:22.434691 28758 generic.go:334] "Generic (PLEG): container finished" podID="482284fd-6911-4ba6-8d57-7966cc51117a" containerID="f3016d799a76a0e861077c16a305235311ad634f8054a70ca605ccf2e9c27c2c" exitCode=0
Feb 23 14:36:22.434820 master-0 kubenswrapper[28758]: I0223 14:36:22.434733 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerDied","Data":"f3016d799a76a0e861077c16a305235311ad634f8054a70ca605ccf2e9c27c2c"}
Feb 23 14:36:22.487239 master-0 kubenswrapper[28758]: I0223 14:36:22.487181 28758 scope.go:117] "RemoveContainer" containerID="cc5b0e807a282b75c570fbfb71a174caf59e3ff1678808f33d1b9369bbe859b7"
Feb 23 14:36:22.499173 master-0 kubenswrapper[28758]: I0223 14:36:22.499092 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:36:22.568744 master-0 kubenswrapper[28758]: I0223 14:36:22.568704 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:36:22.613131 master-0 kubenswrapper[28758]: I0223 14:36:22.612739 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") pod \"959c2393-e914-4c10-a18f-b30fcf012d19\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") "
Feb 23 14:36:22.613131 master-0 kubenswrapper[28758]: I0223 14:36:22.612863 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") pod \"959c2393-e914-4c10-a18f-b30fcf012d19\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") "
Feb 23 14:36:22.613131 master-0 kubenswrapper[28758]: I0223 14:36:22.612928 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") pod \"959c2393-e914-4c10-a18f-b30fcf012d19\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") "
Feb 23 14:36:22.613131 master-0 kubenswrapper[28758]: I0223 14:36:22.612999 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42sml\" (UniqueName: \"kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml\") pod \"959c2393-e914-4c10-a18f-b30fcf012d19\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") "
Feb 23 14:36:22.613131 master-0 kubenswrapper[28758]: I0223 14:36:22.613035 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") pod \"959c2393-e914-4c10-a18f-b30fcf012d19\" (UID: \"959c2393-e914-4c10-a18f-b30fcf012d19\") "
Feb 23 14:36:22.613590 master-0 kubenswrapper[28758]: I0223 14:36:22.613560 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca" (OuterVolumeSpecName: "client-ca") pod "959c2393-e914-4c10-a18f-b30fcf012d19" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:36:22.613772 master-0 kubenswrapper[28758]: I0223 14:36:22.613747 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config" (OuterVolumeSpecName: "config") pod "959c2393-e914-4c10-a18f-b30fcf012d19" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:36:22.613829 master-0 kubenswrapper[28758]: I0223 14:36:22.613742 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "959c2393-e914-4c10-a18f-b30fcf012d19" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:36:22.614210 master-0 kubenswrapper[28758]: I0223 14:36:22.614172 28758 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-client-ca\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.614307 master-0 kubenswrapper[28758]: I0223 14:36:22.614257 28758 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.614388 master-0 kubenswrapper[28758]: I0223 14:36:22.614369 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c2393-e914-4c10-a18f-b30fcf012d19-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.615749 master-0 kubenswrapper[28758]: I0223 14:36:22.615727 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "959c2393-e914-4c10-a18f-b30fcf012d19" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:36:22.615821 master-0 kubenswrapper[28758]: I0223 14:36:22.615789 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml" (OuterVolumeSpecName: "kube-api-access-42sml") pod "959c2393-e914-4c10-a18f-b30fcf012d19" (UID: "959c2393-e914-4c10-a18f-b30fcf012d19"). InnerVolumeSpecName "kube-api-access-42sml". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:36:22.715016 master-0 kubenswrapper[28758]: I0223 14:36:22.714947 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") pod \"482284fd-6911-4ba6-8d57-7966cc51117a\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") "
Feb 23 14:36:22.715016 master-0 kubenswrapper[28758]: I0223 14:36:22.714995 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") pod \"482284fd-6911-4ba6-8d57-7966cc51117a\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") "
Feb 23 14:36:22.715285 master-0 kubenswrapper[28758]: I0223 14:36:22.715074 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khfkr\" (UniqueName: \"kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr\") pod \"482284fd-6911-4ba6-8d57-7966cc51117a\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") "
Feb 23 14:36:22.715285 master-0 kubenswrapper[28758]: I0223 14:36:22.715118 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") pod \"482284fd-6911-4ba6-8d57-7966cc51117a\" (UID: \"482284fd-6911-4ba6-8d57-7966cc51117a\") "
Feb 23 14:36:22.715285 master-0 kubenswrapper[28758]: I0223 14:36:22.715283 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42sml\" (UniqueName: \"kubernetes.io/projected/959c2393-e914-4c10-a18f-b30fcf012d19-kube-api-access-42sml\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.715422 master-0 kubenswrapper[28758]: I0223 14:36:22.715298 28758 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/959c2393-e914-4c10-a18f-b30fcf012d19-serving-cert\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.715925 master-0 kubenswrapper[28758]: I0223 14:36:22.715857 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca" (OuterVolumeSpecName: "client-ca") pod "482284fd-6911-4ba6-8d57-7966cc51117a" (UID: "482284fd-6911-4ba6-8d57-7966cc51117a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:36:22.715996 master-0 kubenswrapper[28758]: I0223 14:36:22.715874 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config" (OuterVolumeSpecName: "config") pod "482284fd-6911-4ba6-8d57-7966cc51117a" (UID: "482284fd-6911-4ba6-8d57-7966cc51117a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:36:22.718223 master-0 kubenswrapper[28758]: I0223 14:36:22.718179 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "482284fd-6911-4ba6-8d57-7966cc51117a" (UID: "482284fd-6911-4ba6-8d57-7966cc51117a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:36:22.718301 master-0 kubenswrapper[28758]: I0223 14:36:22.718243 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr" (OuterVolumeSpecName: "kube-api-access-khfkr") pod "482284fd-6911-4ba6-8d57-7966cc51117a" (UID: "482284fd-6911-4ba6-8d57-7966cc51117a"). InnerVolumeSpecName "kube-api-access-khfkr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:36:22.816105 master-0 kubenswrapper[28758]: I0223 14:36:22.816039 28758 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482284fd-6911-4ba6-8d57-7966cc51117a-serving-cert\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.816105 master-0 kubenswrapper[28758]: I0223 14:36:22.816080 28758 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-client-ca\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.816105 master-0 kubenswrapper[28758]: I0223 14:36:22.816092 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khfkr\" (UniqueName: \"kubernetes.io/projected/482284fd-6911-4ba6-8d57-7966cc51117a-kube-api-access-khfkr\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:22.816105 master-0 kubenswrapper[28758]: I0223 14:36:22.816103 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482284fd-6911-4ba6-8d57-7966cc51117a-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:36:23.285236 master-0 kubenswrapper[28758]: I0223 14:36:23.285154 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bf558b489-f8nz4"]
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: E0223 14:36:23.285600 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager"
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: I0223 14:36:23.285618 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager"
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: E0223 14:36:23.285648 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager"
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: I0223 14:36:23.285657 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager"
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: E0223 14:36:23.285668 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager"
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: I0223 14:36:23.285677 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager"
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: E0223 14:36:23.285695 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager"
Feb 23 14:36:23.285760 master-0 kubenswrapper[28758]: I0223 14:36:23.285703 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager"
Feb 23 14:36:23.286035 master-0 kubenswrapper[28758]: I0223 14:36:23.286002 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager"
Feb 23 14:36:23.286077 master-0 kubenswrapper[28758]: I0223 14:36:23.286042 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager"
Feb 23 14:36:23.286077 master-0 kubenswrapper[28758]: I0223 14:36:23.286057 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" containerName="controller-manager"
Feb 23 14:36:23.286668 master-0 kubenswrapper[28758]: I0223 14:36:23.286644 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.288898 master-0 kubenswrapper[28758]: I0223 14:36:23.288852 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-kdvbz"
Feb 23 14:36:23.289845 master-0 kubenswrapper[28758]: I0223 14:36:23.289793 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"]
Feb 23 14:36:23.290336 master-0 kubenswrapper[28758]: I0223 14:36:23.290302 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" containerName="route-controller-manager"
Feb 23 14:36:23.290920 master-0 kubenswrapper[28758]: I0223 14:36:23.290887 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.296949 master-0 kubenswrapper[28758]: I0223 14:36:23.296904 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"]
Feb 23 14:36:23.299102 master-0 kubenswrapper[28758]: I0223 14:36:23.299057 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf558b489-f8nz4"]
Feb 23 14:36:23.408387 master-0 kubenswrapper[28758]: I0223 14:36:23.406950 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-79f587d78f-xpdg9"]
Feb 23 14:36:23.411512 master-0 kubenswrapper[28758]: I0223 14:36:23.411441 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9"
Feb 23 14:36:23.414553 master-0 kubenswrapper[28758]: I0223 14:36:23.414512 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 23 14:36:23.414762 master-0 kubenswrapper[28758]: I0223 14:36:23.414727 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 23 14:36:23.420114 master-0 kubenswrapper[28758]: I0223 14:36:23.420028 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-79f587d78f-xpdg9"]
Feb 23 14:36:23.425383 master-0 kubenswrapper[28758]: I0223 14:36:23.425102 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7r9f\" (UniqueName: \"kubernetes.io/projected/de9d1b65-e208-41f5-8624-c78a11b2dfb6-kube-api-access-x7r9f\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.425383 master-0 kubenswrapper[28758]: I0223 14:36:23.425190 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de9d1b65-e208-41f5-8624-c78a11b2dfb6-client-ca\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.425383 master-0 kubenswrapper[28758]: I0223 14:36:23.425251 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-client-ca\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.425383 master-0 kubenswrapper[28758]: I0223 14:36:23.425274 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd4cw\" (UniqueName: \"kubernetes.io/projected/72510270-7c44-432a-9c75-f6e99adefa27-kube-api-access-kd4cw\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.425383 master-0 kubenswrapper[28758]: I0223 14:36:23.425331 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de9d1b65-e208-41f5-8624-c78a11b2dfb6-serving-cert\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.425383 master-0 kubenswrapper[28758]: I0223 14:36:23.425366 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-proxy-ca-bundles\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.425888 master-0 kubenswrapper[28758]: I0223 14:36:23.425394 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-config\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.425888 master-0 kubenswrapper[28758]: I0223 14:36:23.425417 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de9d1b65-e208-41f5-8624-c78a11b2dfb6-config\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.425888 master-0 kubenswrapper[28758]: I0223 14:36:23.425440 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72510270-7c44-432a-9c75-f6e99adefa27-serving-cert\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.444061 master-0 kubenswrapper[28758]: I0223 14:36:23.444005 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt" event={"ID":"959c2393-e914-4c10-a18f-b30fcf012d19","Type":"ContainerDied","Data":"121fb1d62a402b22b2ce0dcefcc58af76b44ad548fdacc6da5113c93b5d1d4e0"}
Feb 23 14:36:23.444061 master-0 kubenswrapper[28758]: I0223 14:36:23.444058 28758 scope.go:117] "RemoveContainer" containerID="18b02500a922018fef0fe170792a110deb1ca490ebe442765b459f2885b97744"
Feb 23 14:36:23.444301 master-0 kubenswrapper[28758]: I0223 14:36:23.444124 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"
Feb 23 14:36:23.455224 master-0 kubenswrapper[28758]: I0223 14:36:23.454713 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f" event={"ID":"482284fd-6911-4ba6-8d57-7966cc51117a","Type":"ContainerDied","Data":"842c59a633c6726baab1699104248bceff992214333b768aa99b1550ee1de3d0"}
Feb 23 14:36:23.455224 master-0 kubenswrapper[28758]: I0223 14:36:23.454755 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"
Feb 23 14:36:23.479732 master-0 kubenswrapper[28758]: I0223 14:36:23.479594 28758 scope.go:117] "RemoveContainer" containerID="f3016d799a76a0e861077c16a305235311ad634f8054a70ca605ccf2e9c27c2c"
Feb 23 14:36:23.488514 master-0 kubenswrapper[28758]: I0223 14:36:23.488387 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"]
Feb 23 14:36:23.491540 master-0 kubenswrapper[28758]: I0223 14:36:23.491470 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55d786cb4c-cqkbt"]
Feb 23 14:36:23.504159 master-0 kubenswrapper[28758]: I0223 14:36:23.504092 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"]
Feb 23 14:36:23.507663 master-0 kubenswrapper[28758]: I0223 14:36:23.507620 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8bb99f4f-msq8f"]
Feb 23 14:36:23.527041 master-0 kubenswrapper[28758]: I0223 14:36:23.527006 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de9d1b65-e208-41f5-8624-c78a11b2dfb6-serving-cert\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.527179 master-0 kubenswrapper[28758]: I0223 14:36:23.527061 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-proxy-ca-bundles\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.527179 master-0 kubenswrapper[28758]: I0223 14:36:23.527084 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-config\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.527179 master-0 kubenswrapper[28758]: I0223 14:36:23.527102 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de9d1b65-e208-41f5-8624-c78a11b2dfb6-config\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.527179 master-0 kubenswrapper[28758]: I0223 14:36:23.527121 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/83f5f03a-8877-4435-9e4f-be8e53313076-nginx-conf\") pod \"networking-console-plugin-79f587d78f-xpdg9\" (UID: \"83f5f03a-8877-4435-9e4f-be8e53313076\") " pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9"
Feb 23 14:36:23.527179 master-0 kubenswrapper[28758]: I0223 14:36:23.527143 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72510270-7c44-432a-9c75-f6e99adefa27-serving-cert\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4"
Feb 23 14:36:23.527179 master-0 kubenswrapper[28758]: I0223 14:36:23.527173 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83f5f03a-8877-4435-9e4f-be8e53313076-networking-console-plugin-cert\") pod \"networking-console-plugin-79f587d78f-xpdg9\" (UID: \"83f5f03a-8877-4435-9e4f-be8e53313076\") " pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9"
Feb 23 14:36:23.527408 master-0 kubenswrapper[28758]: I0223 14:36:23.527202 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7r9f\" (UniqueName: \"kubernetes.io/projected/de9d1b65-e208-41f5-8624-c78a11b2dfb6-kube-api-access-x7r9f\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.527408 master-0 kubenswrapper[28758]: I0223 14:36:23.527288 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de9d1b65-e208-41f5-8624-c78a11b2dfb6-client-ca\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"
Feb 23 14:36:23.527408 master-0 kubenswrapper[28758]: I0223 14:36:23.527338 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName:
\"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-client-ca\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.527408 master-0 kubenswrapper[28758]: I0223 14:36:23.527365 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd4cw\" (UniqueName: \"kubernetes.io/projected/72510270-7c44-432a-9c75-f6e99adefa27-kube-api-access-kd4cw\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.528397 master-0 kubenswrapper[28758]: I0223 14:36:23.528356 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de9d1b65-e208-41f5-8624-c78a11b2dfb6-client-ca\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" Feb 23 14:36:23.528461 master-0 kubenswrapper[28758]: I0223 14:36:23.528411 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-client-ca\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.528531 master-0 kubenswrapper[28758]: I0223 14:36:23.528507 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de9d1b65-e208-41f5-8624-c78a11b2dfb6-config\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" Feb 23 14:36:23.528570 
master-0 kubenswrapper[28758]: I0223 14:36:23.528520 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-config\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.528714 master-0 kubenswrapper[28758]: I0223 14:36:23.528688 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72510270-7c44-432a-9c75-f6e99adefa27-proxy-ca-bundles\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.543536 master-0 kubenswrapper[28758]: I0223 14:36:23.530267 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de9d1b65-e208-41f5-8624-c78a11b2dfb6-serving-cert\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" Feb 23 14:36:23.543536 master-0 kubenswrapper[28758]: I0223 14:36:23.531059 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72510270-7c44-432a-9c75-f6e99adefa27-serving-cert\") pod \"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.543536 master-0 kubenswrapper[28758]: I0223 14:36:23.542565 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd4cw\" (UniqueName: \"kubernetes.io/projected/72510270-7c44-432a-9c75-f6e99adefa27-kube-api-access-kd4cw\") pod 
\"controller-manager-5bf558b489-f8nz4\" (UID: \"72510270-7c44-432a-9c75-f6e99adefa27\") " pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.543536 master-0 kubenswrapper[28758]: I0223 14:36:23.542847 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7r9f\" (UniqueName: \"kubernetes.io/projected/de9d1b65-e208-41f5-8624-c78a11b2dfb6-kube-api-access-x7r9f\") pod \"route-controller-manager-5c58f5c645-vwxsj\" (UID: \"de9d1b65-e208-41f5-8624-c78a11b2dfb6\") " pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" Feb 23 14:36:23.617157 master-0 kubenswrapper[28758]: I0223 14:36:23.617088 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:23.628628 master-0 kubenswrapper[28758]: I0223 14:36:23.628574 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/83f5f03a-8877-4435-9e4f-be8e53313076-nginx-conf\") pod \"networking-console-plugin-79f587d78f-xpdg9\" (UID: \"83f5f03a-8877-4435-9e4f-be8e53313076\") " pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" Feb 23 14:36:23.628807 master-0 kubenswrapper[28758]: I0223 14:36:23.628772 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" Feb 23 14:36:23.629175 master-0 kubenswrapper[28758]: I0223 14:36:23.629113 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83f5f03a-8877-4435-9e4f-be8e53313076-networking-console-plugin-cert\") pod \"networking-console-plugin-79f587d78f-xpdg9\" (UID: \"83f5f03a-8877-4435-9e4f-be8e53313076\") " pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" Feb 23 14:36:23.629248 master-0 kubenswrapper[28758]: E0223 14:36:23.629225 28758 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Feb 23 14:36:23.629390 master-0 kubenswrapper[28758]: E0223 14:36:23.629275 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f5f03a-8877-4435-9e4f-be8e53313076-networking-console-plugin-cert podName:83f5f03a-8877-4435-9e4f-be8e53313076 nodeName:}" failed. No retries permitted until 2026-02-23 14:36:24.129257247 +0000 UTC m=+116.255573179 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/83f5f03a-8877-4435-9e4f-be8e53313076-networking-console-plugin-cert") pod "networking-console-plugin-79f587d78f-xpdg9" (UID: "83f5f03a-8877-4435-9e4f-be8e53313076") : secret "networking-console-plugin-cert" not found Feb 23 14:36:23.629451 master-0 kubenswrapper[28758]: I0223 14:36:23.629429 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/83f5f03a-8877-4435-9e4f-be8e53313076-nginx-conf\") pod \"networking-console-plugin-79f587d78f-xpdg9\" (UID: \"83f5f03a-8877-4435-9e4f-be8e53313076\") " pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" Feb 23 14:36:23.835232 master-0 kubenswrapper[28758]: I0223 14:36:23.835167 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fb7cc48c6-v85rz"] Feb 23 14:36:23.869634 master-0 kubenswrapper[28758]: I0223 14:36:23.869577 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c65cbb888-4xqr4"] Feb 23 14:36:23.870865 master-0 kubenswrapper[28758]: I0223 14:36:23.870833 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:23.884421 master-0 kubenswrapper[28758]: I0223 14:36:23.884209 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c65cbb888-4xqr4"] Feb 23 14:36:23.934265 master-0 kubenswrapper[28758]: I0223 14:36:23.934225 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zpvn\" (UniqueName: \"kubernetes.io/projected/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-kube-api-access-5zpvn\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:23.934496 master-0 kubenswrapper[28758]: I0223 14:36:23.934274 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-trusted-ca-bundle\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:23.934496 master-0 kubenswrapper[28758]: I0223 14:36:23.934319 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-service-ca\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:23.934496 master-0 kubenswrapper[28758]: I0223 14:36:23.934391 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-config\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:23.934496 master-0 
kubenswrapper[28758]: I0223 14:36:23.934423 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-oauth-serving-cert\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:23.934496 master-0 kubenswrapper[28758]: I0223 14:36:23.934471 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-oauth-config\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:23.934720 master-0 kubenswrapper[28758]: I0223 14:36:23.934518 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-serving-cert\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.016737 master-0 kubenswrapper[28758]: I0223 14:36:24.016585 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf558b489-f8nz4"] Feb 23 14:36:24.035836 master-0 kubenswrapper[28758]: I0223 14:36:24.035795 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zpvn\" (UniqueName: \"kubernetes.io/projected/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-kube-api-access-5zpvn\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.035960 master-0 kubenswrapper[28758]: I0223 14:36:24.035848 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-trusted-ca-bundle\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.035960 master-0 kubenswrapper[28758]: I0223 14:36:24.035883 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-service-ca\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.035960 master-0 kubenswrapper[28758]: I0223 14:36:24.035949 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-config\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.036082 master-0 kubenswrapper[28758]: I0223 14:36:24.035985 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-oauth-serving-cert\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.036082 master-0 kubenswrapper[28758]: I0223 14:36:24.036053 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-oauth-config\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.036173 master-0 kubenswrapper[28758]: I0223 14:36:24.036085 
28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-serving-cert\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.036932 master-0 kubenswrapper[28758]: I0223 14:36:24.036907 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-service-ca\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.036932 master-0 kubenswrapper[28758]: I0223 14:36:24.036914 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-trusted-ca-bundle\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.037519 master-0 kubenswrapper[28758]: I0223 14:36:24.037465 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-oauth-serving-cert\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.037626 master-0 kubenswrapper[28758]: I0223 14:36:24.037598 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-config\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.039610 master-0 kubenswrapper[28758]: I0223 14:36:24.039577 
28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-oauth-config\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.040057 master-0 kubenswrapper[28758]: I0223 14:36:24.040030 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-serving-cert\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.052952 master-0 kubenswrapper[28758]: I0223 14:36:24.052922 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zpvn\" (UniqueName: \"kubernetes.io/projected/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-kube-api-access-5zpvn\") pod \"console-7c65cbb888-4xqr4\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.095193 master-0 kubenswrapper[28758]: I0223 14:36:24.095141 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482284fd-6911-4ba6-8d57-7966cc51117a" path="/var/lib/kubelet/pods/482284fd-6911-4ba6-8d57-7966cc51117a/volumes" Feb 23 14:36:24.095797 master-0 kubenswrapper[28758]: I0223 14:36:24.095758 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959c2393-e914-4c10-a18f-b30fcf012d19" path="/var/lib/kubelet/pods/959c2393-e914-4c10-a18f-b30fcf012d19/volumes" Feb 23 14:36:24.112364 master-0 kubenswrapper[28758]: I0223 14:36:24.112293 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj"] Feb 23 14:36:24.119935 master-0 kubenswrapper[28758]: W0223 14:36:24.119888 28758 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde9d1b65_e208_41f5_8624_c78a11b2dfb6.slice/crio-299c6d2b628f904a258c87bc65140a968cb61c4b9ea1611e8652c0683a653c58 WatchSource:0}: Error finding container 299c6d2b628f904a258c87bc65140a968cb61c4b9ea1611e8652c0683a653c58: Status 404 returned error can't find the container with id 299c6d2b628f904a258c87bc65140a968cb61c4b9ea1611e8652c0683a653c58 Feb 23 14:36:24.136947 master-0 kubenswrapper[28758]: I0223 14:36:24.136901 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83f5f03a-8877-4435-9e4f-be8e53313076-networking-console-plugin-cert\") pod \"networking-console-plugin-79f587d78f-xpdg9\" (UID: \"83f5f03a-8877-4435-9e4f-be8e53313076\") " pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" Feb 23 14:36:24.140363 master-0 kubenswrapper[28758]: I0223 14:36:24.140325 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/83f5f03a-8877-4435-9e4f-be8e53313076-networking-console-plugin-cert\") pod \"networking-console-plugin-79f587d78f-xpdg9\" (UID: \"83f5f03a-8877-4435-9e4f-be8e53313076\") " pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" Feb 23 14:36:24.192461 master-0 kubenswrapper[28758]: I0223 14:36:24.192393 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:24.339432 master-0 kubenswrapper[28758]: I0223 14:36:24.338601 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" Feb 23 14:36:24.500930 master-0 kubenswrapper[28758]: I0223 14:36:24.500856 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-87c947f7d-jz97m" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerName="console" containerID="cri-o://49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de" gracePeriod=15 Feb 23 14:36:24.503409 master-0 kubenswrapper[28758]: I0223 14:36:24.503369 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" event={"ID":"de9d1b65-e208-41f5-8624-c78a11b2dfb6","Type":"ContainerStarted","Data":"e906c967d2eb0e363b70c955a963d9e21bd9b503014f8441ecad48f908c702a9"} Feb 23 14:36:24.503508 master-0 kubenswrapper[28758]: I0223 14:36:24.503435 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" event={"ID":"de9d1b65-e208-41f5-8624-c78a11b2dfb6","Type":"ContainerStarted","Data":"299c6d2b628f904a258c87bc65140a968cb61c4b9ea1611e8652c0683a653c58"} Feb 23 14:36:24.503828 master-0 kubenswrapper[28758]: I0223 14:36:24.503787 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" Feb 23 14:36:24.513501 master-0 kubenswrapper[28758]: I0223 14:36:24.511689 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" event={"ID":"72510270-7c44-432a-9c75-f6e99adefa27","Type":"ContainerStarted","Data":"fe91d57e406c3f57dab73ae70275069dacc64c2ed9fef380457832c7eddba718"} Feb 23 14:36:24.513501 master-0 kubenswrapper[28758]: I0223 14:36:24.511742 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" 
event={"ID":"72510270-7c44-432a-9c75-f6e99adefa27","Type":"ContainerStarted","Data":"07a2640df67116e8e4dc0bb571456bc99b19f5bf7e46b9e6eba58ccd8ad28418"} Feb 23 14:36:24.513753 master-0 kubenswrapper[28758]: I0223 14:36:24.513643 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:24.522978 master-0 kubenswrapper[28758]: I0223 14:36:24.522864 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" podStartSLOduration=2.522841789 podStartE2EDuration="2.522841789s" podCreationTimestamp="2026-02-23 14:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:36:24.522794477 +0000 UTC m=+116.649110419" watchObservedRunningTime="2026-02-23 14:36:24.522841789 +0000 UTC m=+116.649157721" Feb 23 14:36:24.523461 master-0 kubenswrapper[28758]: I0223 14:36:24.523397 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" Feb 23 14:36:24.554609 master-0 kubenswrapper[28758]: I0223 14:36:24.554519 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bf558b489-f8nz4" podStartSLOduration=2.554495233 podStartE2EDuration="2.554495233s" podCreationTimestamp="2026-02-23 14:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:36:24.552246734 +0000 UTC m=+116.678562686" watchObservedRunningTime="2026-02-23 14:36:24.554495233 +0000 UTC m=+116.680811165" Feb 23 14:36:24.650844 master-0 kubenswrapper[28758]: I0223 14:36:24.650778 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-5c58f5c645-vwxsj" Feb 23 14:36:24.681041 master-0 kubenswrapper[28758]: I0223 14:36:24.680973 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c65cbb888-4xqr4"] Feb 23 14:36:24.797853 master-0 kubenswrapper[28758]: I0223 14:36:24.797803 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-79f587d78f-xpdg9"] Feb 23 14:36:24.939813 master-0 kubenswrapper[28758]: I0223 14:36:24.939775 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-87c947f7d-jz97m_26bb163c-8d5f-42bc-b5a6-25f7e2e214e6/console/0.log" Feb 23 14:36:24.939935 master-0 kubenswrapper[28758]: I0223 14:36:24.939868 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-87c947f7d-jz97m" Feb 23 14:36:24.950315 master-0 kubenswrapper[28758]: I0223 14:36:24.950259 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-oauth-serving-cert\") pod \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " Feb 23 14:36:24.950315 master-0 kubenswrapper[28758]: I0223 14:36:24.950307 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-oauth-config\") pod \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " Feb 23 14:36:24.950591 master-0 kubenswrapper[28758]: I0223 14:36:24.950346 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-serving-cert\") pod \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\" 
(UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " Feb 23 14:36:24.950591 master-0 kubenswrapper[28758]: I0223 14:36:24.950369 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-service-ca\") pod \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " Feb 23 14:36:24.950591 master-0 kubenswrapper[28758]: I0223 14:36:24.950409 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-config\") pod \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " Feb 23 14:36:24.950591 master-0 kubenswrapper[28758]: I0223 14:36:24.950456 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wplbg\" (UniqueName: \"kubernetes.io/projected/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-kube-api-access-wplbg\") pod \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\" (UID: \"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6\") " Feb 23 14:36:24.951745 master-0 kubenswrapper[28758]: I0223 14:36:24.951696 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-service-ca" (OuterVolumeSpecName: "service-ca") pod "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" (UID: "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:36:24.952276 master-0 kubenswrapper[28758]: I0223 14:36:24.952238 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-config" (OuterVolumeSpecName: "console-config") pod "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" (UID: "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:36:24.952409 master-0 kubenswrapper[28758]: I0223 14:36:24.952346 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" (UID: "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:36:24.953428 master-0 kubenswrapper[28758]: I0223 14:36:24.953389 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-kube-api-access-wplbg" (OuterVolumeSpecName: "kube-api-access-wplbg") pod "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" (UID: "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6"). InnerVolumeSpecName "kube-api-access-wplbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:36:24.953951 master-0 kubenswrapper[28758]: I0223 14:36:24.953917 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" (UID: "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:36:24.954125 master-0 kubenswrapper[28758]: I0223 14:36:24.954098 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" (UID: "26bb163c-8d5f-42bc-b5a6-25f7e2e214e6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:36:25.052088 master-0 kubenswrapper[28758]: I0223 14:36:25.051964 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wplbg\" (UniqueName: \"kubernetes.io/projected/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-kube-api-access-wplbg\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:25.052088 master-0 kubenswrapper[28758]: I0223 14:36:25.052024 28758 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:25.052088 master-0 kubenswrapper[28758]: I0223 14:36:25.052040 28758 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:25.052088 master-0 kubenswrapper[28758]: I0223 14:36:25.052051 28758 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:25.052088 master-0 kubenswrapper[28758]: I0223 14:36:25.052063 28758 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-service-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:25.052088 master-0 kubenswrapper[28758]: I0223 14:36:25.052074 28758 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6-console-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:25.523815 master-0 kubenswrapper[28758]: I0223 14:36:25.523747 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c65cbb888-4xqr4" 
event={"ID":"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae","Type":"ContainerStarted","Data":"b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc"} Feb 23 14:36:25.523815 master-0 kubenswrapper[28758]: I0223 14:36:25.523808 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c65cbb888-4xqr4" event={"ID":"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae","Type":"ContainerStarted","Data":"7b441f14a306510646ba50e569176b796adaf9ce100e9c4629c4acf49c8a1a24"} Feb 23 14:36:25.527100 master-0 kubenswrapper[28758]: I0223 14:36:25.527056 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-87c947f7d-jz97m_26bb163c-8d5f-42bc-b5a6-25f7e2e214e6/console/0.log" Feb 23 14:36:25.527206 master-0 kubenswrapper[28758]: I0223 14:36:25.527101 28758 generic.go:334] "Generic (PLEG): container finished" podID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerID="49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de" exitCode=2 Feb 23 14:36:25.527206 master-0 kubenswrapper[28758]: I0223 14:36:25.527185 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-87c947f7d-jz97m" Feb 23 14:36:25.527352 master-0 kubenswrapper[28758]: I0223 14:36:25.527323 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87c947f7d-jz97m" event={"ID":"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6","Type":"ContainerDied","Data":"49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de"} Feb 23 14:36:25.527442 master-0 kubenswrapper[28758]: I0223 14:36:25.527428 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-87c947f7d-jz97m" event={"ID":"26bb163c-8d5f-42bc-b5a6-25f7e2e214e6","Type":"ContainerDied","Data":"f6a667abcba45d3005bf5f20f9c1ebf19e7024613296386a208e38e2361f9569"} Feb 23 14:36:25.527530 master-0 kubenswrapper[28758]: I0223 14:36:25.527490 28758 scope.go:117] "RemoveContainer" containerID="49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de" Feb 23 14:36:25.530241 master-0 kubenswrapper[28758]: I0223 14:36:25.530220 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" event={"ID":"83f5f03a-8877-4435-9e4f-be8e53313076","Type":"ContainerStarted","Data":"f5438ea037111713be94273d0cca734a23d32a2aa516d680dd1c80e349bd6933"} Feb 23 14:36:25.544341 master-0 kubenswrapper[28758]: I0223 14:36:25.544268 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c65cbb888-4xqr4" podStartSLOduration=2.544245139 podStartE2EDuration="2.544245139s" podCreationTimestamp="2026-02-23 14:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:36:25.541870007 +0000 UTC m=+117.668185959" watchObservedRunningTime="2026-02-23 14:36:25.544245139 +0000 UTC m=+117.670561071" Feb 23 14:36:25.547669 master-0 kubenswrapper[28758]: I0223 14:36:25.547630 28758 scope.go:117] "RemoveContainer" 
containerID="49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de" Feb 23 14:36:25.548924 master-0 kubenswrapper[28758]: E0223 14:36:25.548884 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de\": container with ID starting with 49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de not found: ID does not exist" containerID="49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de" Feb 23 14:36:25.549065 master-0 kubenswrapper[28758]: I0223 14:36:25.549036 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de"} err="failed to get container status \"49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de\": rpc error: code = NotFound desc = could not find container \"49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de\": container with ID starting with 49c0fbbae99623d0bf4718bd25ac7308e61922b24b0082198cebf40df25bf2de not found: ID does not exist" Feb 23 14:36:25.561512 master-0 kubenswrapper[28758]: I0223 14:36:25.561426 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-87c947f7d-jz97m"] Feb 23 14:36:25.566432 master-0 kubenswrapper[28758]: I0223 14:36:25.566389 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-87c947f7d-jz97m"] Feb 23 14:36:26.101494 master-0 kubenswrapper[28758]: I0223 14:36:26.101408 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" path="/var/lib/kubelet/pods/26bb163c-8d5f-42bc-b5a6-25f7e2e214e6/volumes" Feb 23 14:36:26.540229 master-0 kubenswrapper[28758]: I0223 14:36:26.540131 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" 
event={"ID":"83f5f03a-8877-4435-9e4f-be8e53313076","Type":"ContainerStarted","Data":"33bd871aa62f988228d4d4b7d718105ef56474ee59e30e518d629ed1794306b1"} Feb 23 14:36:26.571990 master-0 kubenswrapper[28758]: I0223 14:36:26.571842 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-79f587d78f-xpdg9" podStartSLOduration=2.250390875 podStartE2EDuration="3.57176064s" podCreationTimestamp="2026-02-23 14:36:23 +0000 UTC" firstStartedPulling="2026-02-23 14:36:24.809598047 +0000 UTC m=+116.935913979" lastFinishedPulling="2026-02-23 14:36:26.130967812 +0000 UTC m=+118.257283744" observedRunningTime="2026-02-23 14:36:26.567686842 +0000 UTC m=+118.694002794" watchObservedRunningTime="2026-02-23 14:36:26.57176064 +0000 UTC m=+118.698076582" Feb 23 14:36:27.091368 master-0 kubenswrapper[28758]: I0223 14:36:27.091288 28758 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:36:27.091816 master-0 kubenswrapper[28758]: E0223 14:36:27.091628 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerName="console" Feb 23 14:36:27.091816 master-0 kubenswrapper[28758]: I0223 14:36:27.091644 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerName="console" Feb 23 14:36:27.091918 master-0 kubenswrapper[28758]: I0223 14:36:27.091868 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="26bb163c-8d5f-42bc-b5a6-25f7e2e214e6" containerName="console" Feb 23 14:36:27.092513 master-0 kubenswrapper[28758]: I0223 14:36:27.092427 28758 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Feb 23 14:36:27.092688 master-0 kubenswrapper[28758]: I0223 14:36:27.092632 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.092772 master-0 kubenswrapper[28758]: I0223 14:36:27.092734 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver" containerID="cri-o://8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2" gracePeriod=15 Feb 23 14:36:27.092844 master-0 kubenswrapper[28758]: I0223 14:36:27.092808 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193" gracePeriod=15 Feb 23 14:36:27.092919 master-0 kubenswrapper[28758]: I0223 14:36:27.092869 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-syncer" containerID="cri-o://db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa" gracePeriod=15 Feb 23 14:36:27.092919 master-0 kubenswrapper[28758]: I0223 14:36:27.092903 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9" gracePeriod=15 Feb 23 14:36:27.093015 master-0 kubenswrapper[28758]: I0223 14:36:27.092748 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-check-endpoints" 
containerID="cri-o://f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4" gracePeriod=15 Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: I0223 14:36:27.094547 28758 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: E0223 14:36:27.094907 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: I0223 14:36:27.094924 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: E0223 14:36:27.094955 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-check-endpoints" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: I0223 14:36:27.094988 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-check-endpoints" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: E0223 14:36:27.095008 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="setup" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: I0223 14:36:27.095018 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="setup" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: E0223 14:36:27.095034 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: I0223 14:36:27.095043 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c75833224b4ba3fa488b77d8f5032" 
containerName="kube-apiserver" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: E0223 14:36:27.095079 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-check-endpoints" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: I0223 14:36:27.095089 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-check-endpoints" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: E0223 14:36:27.095100 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-insecure-readyz" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: I0223 14:36:27.095108 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-insecure-readyz" Feb 23 14:36:27.095615 master-0 kubenswrapper[28758]: E0223 14:36:27.095126 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-syncer" Feb 23 14:36:27.097183 master-0 kubenswrapper[28758]: I0223 14:36:27.095157 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-syncer" Feb 23 14:36:27.097183 master-0 kubenswrapper[28758]: I0223 14:36:27.095923 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-insecure-readyz" Feb 23 14:36:27.097183 master-0 kubenswrapper[28758]: I0223 14:36:27.095940 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 14:36:27.097183 master-0 kubenswrapper[28758]: I0223 14:36:27.095997 28758 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-check-endpoints" Feb 23 14:36:27.097183 master-0 kubenswrapper[28758]: I0223 14:36:27.096014 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver" Feb 23 14:36:27.097183 master-0 kubenswrapper[28758]: I0223 14:36:27.096033 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-cert-syncer" Feb 23 14:36:27.097183 master-0 kubenswrapper[28758]: I0223 14:36:27.096075 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c75833224b4ba3fa488b77d8f5032" containerName="kube-apiserver-check-endpoints" Feb 23 14:36:27.135455 master-0 kubenswrapper[28758]: I0223 14:36:27.135141 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:36:27.188458 master-0 kubenswrapper[28758]: I0223 14:36:27.188390 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.188458 master-0 kubenswrapper[28758]: I0223 14:36:27.188459 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.188747 master-0 kubenswrapper[28758]: I0223 14:36:27.188563 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.192597 master-0 kubenswrapper[28758]: I0223 14:36:27.189064 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.192597 master-0 kubenswrapper[28758]: I0223 14:36:27.189321 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.291453 master-0 kubenswrapper[28758]: I0223 14:36:27.291404 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.291453 master-0 kubenswrapper[28758]: I0223 14:36:27.291451 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.291906 master-0 kubenswrapper[28758]: I0223 14:36:27.291502 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.291906 master-0 kubenswrapper[28758]: I0223 14:36:27.291593 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.291906 master-0 kubenswrapper[28758]: I0223 14:36:27.291653 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.291906 master-0 kubenswrapper[28758]: I0223 14:36:27.291683 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.291906 master-0 kubenswrapper[28758]: I0223 14:36:27.291731 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-manifests\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.291906 master-0 kubenswrapper[28758]: I0223 14:36:27.291801 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.292182 master-0 kubenswrapper[28758]: I0223 14:36:27.291920 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.292182 master-0 kubenswrapper[28758]: I0223 14:36:27.291959 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.292182 master-0 kubenswrapper[28758]: I0223 14:36:27.291972 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.292182 master-0 kubenswrapper[28758]: I0223 14:36:27.292001 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.292182 master-0 kubenswrapper[28758]: I0223 14:36:27.292008 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.394355 master-0 kubenswrapper[28758]: I0223 14:36:27.394014 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.394355 master-0 kubenswrapper[28758]: I0223 14:36:27.394090 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.394355 master-0 kubenswrapper[28758]: I0223 14:36:27.394120 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.394355 master-0 kubenswrapper[28758]: I0223 14:36:27.394242 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.396025 master-0 kubenswrapper[28758]: I0223 14:36:27.394391 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.396025 master-0 kubenswrapper[28758]: I0223 14:36:27.394504 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/487622064474ed0ec70f7bf2a0fcb80b-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"487622064474ed0ec70f7bf2a0fcb80b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:27.433592 master-0 kubenswrapper[28758]: I0223 14:36:27.433027 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:36:27.459137 master-0 kubenswrapper[28758]: E0223 14:36:27.458862 28758 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.1896e6e7fc07d0b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:2146f0e3671998cad8bbc2464b009ab7,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:36:27.457073333 +0000 UTC m=+119.583389265,LastTimestamp:2026-02-23 14:36:27.457073333 +0000 UTC m=+119.583389265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:36:27.564987 master-0 kubenswrapper[28758]: I0223 14:36:27.563683 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_959c75833224b4ba3fa488b77d8f5032/kube-apiserver-check-endpoints/0.log" Feb 23 14:36:27.565911 master-0 kubenswrapper[28758]: I0223 14:36:27.565879 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_959c75833224b4ba3fa488b77d8f5032/kube-apiserver-cert-syncer/0.log" Feb 23 14:36:27.566923 master-0 kubenswrapper[28758]: I0223 14:36:27.566889 28758 generic.go:334] "Generic (PLEG): container finished" 
podID="959c75833224b4ba3fa488b77d8f5032" containerID="f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4" exitCode=0 Feb 23 14:36:27.566923 master-0 kubenswrapper[28758]: I0223 14:36:27.566916 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c75833224b4ba3fa488b77d8f5032" containerID="03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9" exitCode=0 Feb 23 14:36:27.566923 master-0 kubenswrapper[28758]: I0223 14:36:27.566925 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c75833224b4ba3fa488b77d8f5032" containerID="f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193" exitCode=0 Feb 23 14:36:27.567053 master-0 kubenswrapper[28758]: I0223 14:36:27.566932 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c75833224b4ba3fa488b77d8f5032" containerID="db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa" exitCode=2 Feb 23 14:36:27.567053 master-0 kubenswrapper[28758]: I0223 14:36:27.567015 28758 scope.go:117] "RemoveContainer" containerID="7b1c7cc7b2e3d4c1fdbe5b4592355d6abc03a37f10ae5ab746402745b7ae1aa2" Feb 23 14:36:27.569237 master-0 kubenswrapper[28758]: I0223 14:36:27.569020 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"2146f0e3671998cad8bbc2464b009ab7","Type":"ContainerStarted","Data":"b2bc371871929cb2e427ec98bc13e1481e2d40928164d960089529430133aa86"} Feb 23 14:36:27.571510 master-0 kubenswrapper[28758]: I0223 14:36:27.571446 28758 generic.go:334] "Generic (PLEG): container finished" podID="069569a4-34c1-4752-af70-b31bcfca4177" containerID="bbef5d1f87604552e5038241fb1806db8690b9c5aaff08ce5041ceb6958b2770" exitCode=0 Feb 23 14:36:27.571675 master-0 kubenswrapper[28758]: I0223 14:36:27.571637 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" 
event={"ID":"069569a4-34c1-4752-af70-b31bcfca4177","Type":"ContainerDied","Data":"bbef5d1f87604552e5038241fb1806db8690b9c5aaff08ce5041ceb6958b2770"} Feb 23 14:36:27.572887 master-0 kubenswrapper[28758]: I0223 14:36:27.572846 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.573640 master-0 kubenswrapper[28758]: I0223 14:36:27.573592 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.574127 master-0 kubenswrapper[28758]: I0223 14:36:27.574100 28758 status_manager.go:851] "Failed to get status for pod" podUID="959c75833224b4ba3fa488b77d8f5032" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.645501 master-0 kubenswrapper[28758]: E0223 14:36:27.645382 28758 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.646026 master-0 kubenswrapper[28758]: E0223 14:36:27.645985 28758 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.646675 master-0 kubenswrapper[28758]: E0223 14:36:27.646647 28758 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.647269 master-0 kubenswrapper[28758]: E0223 14:36:27.647240 28758 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.647946 master-0 kubenswrapper[28758]: E0223 14:36:27.647870 28758 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:27.648012 master-0 kubenswrapper[28758]: I0223 14:36:27.647950 28758 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 23 14:36:27.648525 master-0 kubenswrapper[28758]: E0223 14:36:27.648494 28758 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Feb 23 14:36:27.849732 master-0 kubenswrapper[28758]: E0223 14:36:27.849634 28758 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial 
tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Feb 23 14:36:28.097467 master-0 kubenswrapper[28758]: I0223 14:36:28.097357 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:28.098568 master-0 kubenswrapper[28758]: I0223 14:36:28.098452 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:28.099630 master-0 kubenswrapper[28758]: I0223 14:36:28.099457 28758 status_manager.go:851] "Failed to get status for pod" podUID="959c75833224b4ba3fa488b77d8f5032" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:28.252015 master-0 kubenswrapper[28758]: E0223 14:36:28.251829 28758 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Feb 23 14:36:28.582079 master-0 kubenswrapper[28758]: I0223 14:36:28.582019 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_959c75833224b4ba3fa488b77d8f5032/kube-apiserver-cert-syncer/0.log" Feb 23 14:36:28.585698 master-0 
kubenswrapper[28758]: I0223 14:36:28.585030 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"2146f0e3671998cad8bbc2464b009ab7","Type":"ContainerStarted","Data":"27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c"} Feb 23 14:36:28.586733 master-0 kubenswrapper[28758]: I0223 14:36:28.586671 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:28.587340 master-0 kubenswrapper[28758]: I0223 14:36:28.587269 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:28.922642 master-0 kubenswrapper[28758]: I0223 14:36:28.922579 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:36:28.923911 master-0 kubenswrapper[28758]: I0223 14:36:28.923840 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:28.924590 master-0 kubenswrapper[28758]: I0223 14:36:28.924548 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:28.929000 master-0 kubenswrapper[28758]: I0223 14:36:28.928955 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-var-lock\") pod \"069569a4-34c1-4752-af70-b31bcfca4177\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " Feb 23 14:36:28.929105 master-0 kubenswrapper[28758]: I0223 14:36:28.929010 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-kubelet-dir\") pod \"069569a4-34c1-4752-af70-b31bcfca4177\" (UID: \"069569a4-34c1-4752-af70-b31bcfca4177\") " Feb 23 14:36:28.929105 master-0 kubenswrapper[28758]: I0223 14:36:28.929068 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069569a4-34c1-4752-af70-b31bcfca4177-kube-api-access\") pod \"069569a4-34c1-4752-af70-b31bcfca4177\" (UID: 
\"069569a4-34c1-4752-af70-b31bcfca4177\") " Feb 23 14:36:28.929105 master-0 kubenswrapper[28758]: I0223 14:36:28.929077 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-var-lock" (OuterVolumeSpecName: "var-lock") pod "069569a4-34c1-4752-af70-b31bcfca4177" (UID: "069569a4-34c1-4752-af70-b31bcfca4177"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:36:28.929225 master-0 kubenswrapper[28758]: I0223 14:36:28.929126 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "069569a4-34c1-4752-af70-b31bcfca4177" (UID: "069569a4-34c1-4752-af70-b31bcfca4177"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:36:28.929543 master-0 kubenswrapper[28758]: I0223 14:36:28.929508 28758 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:28.929604 master-0 kubenswrapper[28758]: I0223 14:36:28.929541 28758 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/069569a4-34c1-4752-af70-b31bcfca4177-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:28.932113 master-0 kubenswrapper[28758]: I0223 14:36:28.932084 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069569a4-34c1-4752-af70-b31bcfca4177-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "069569a4-34c1-4752-af70-b31bcfca4177" (UID: "069569a4-34c1-4752-af70-b31bcfca4177"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:36:29.030405 master-0 kubenswrapper[28758]: I0223 14:36:29.030327 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/069569a4-34c1-4752-af70-b31bcfca4177-kube-api-access\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:29.054674 master-0 kubenswrapper[28758]: E0223 14:36:29.054568 28758 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Feb 23 14:36:29.546743 master-0 kubenswrapper[28758]: I0223 14:36:29.546691 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_959c75833224b4ba3fa488b77d8f5032/kube-apiserver-cert-syncer/0.log" Feb 23 14:36:29.547556 master-0 kubenswrapper[28758]: I0223 14:36:29.547507 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:29.548514 master-0 kubenswrapper[28758]: I0223 14:36:29.548455 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.548967 master-0 kubenswrapper[28758]: I0223 14:36:29.548926 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.549556 master-0 kubenswrapper[28758]: I0223 14:36:29.549508 28758 status_manager.go:851] "Failed to get status for pod" podUID="959c75833224b4ba3fa488b77d8f5032" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.595558 master-0 kubenswrapper[28758]: I0223 14:36:29.595452 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Feb 23 14:36:29.595558 master-0 kubenswrapper[28758]: I0223 14:36:29.595443 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"069569a4-34c1-4752-af70-b31bcfca4177","Type":"ContainerDied","Data":"ef785d771950769f3a1e100f2ce635b4ca76cd2406c83804433533340c74bdc4"} Feb 23 14:36:29.595558 master-0 kubenswrapper[28758]: I0223 14:36:29.595550 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef785d771950769f3a1e100f2ce635b4ca76cd2406c83804433533340c74bdc4" Feb 23 14:36:29.599072 master-0 kubenswrapper[28758]: I0223 14:36:29.599023 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_959c75833224b4ba3fa488b77d8f5032/kube-apiserver-cert-syncer/0.log" Feb 23 14:36:29.600162 master-0 kubenswrapper[28758]: I0223 14:36:29.600088 28758 generic.go:334] "Generic (PLEG): container finished" podID="959c75833224b4ba3fa488b77d8f5032" containerID="8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2" exitCode=0 Feb 23 14:36:29.600258 master-0 kubenswrapper[28758]: I0223 14:36:29.600199 28758 scope.go:117] "RemoveContainer" containerID="f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4" Feb 23 14:36:29.600258 master-0 kubenswrapper[28758]: I0223 14:36:29.600223 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:29.626610 master-0 kubenswrapper[28758]: I0223 14:36:29.626533 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.627265 master-0 kubenswrapper[28758]: I0223 14:36:29.627207 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.627762 master-0 kubenswrapper[28758]: I0223 14:36:29.627708 28758 status_manager.go:851] "Failed to get status for pod" podUID="959c75833224b4ba3fa488b77d8f5032" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.628882 master-0 kubenswrapper[28758]: I0223 14:36:29.628847 28758 scope.go:117] "RemoveContainer" containerID="03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9" Feb 23 14:36:29.643805 master-0 kubenswrapper[28758]: I0223 14:36:29.643764 28758 scope.go:117] "RemoveContainer" containerID="f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193" Feb 23 14:36:29.654194 master-0 kubenswrapper[28758]: I0223 14:36:29.654097 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") pod \"959c75833224b4ba3fa488b77d8f5032\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " Feb 23 14:36:29.654295 master-0 kubenswrapper[28758]: I0223 14:36:29.654233 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") pod \"959c75833224b4ba3fa488b77d8f5032\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " Feb 23 14:36:29.654295 master-0 kubenswrapper[28758]: I0223 14:36:29.654251 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "959c75833224b4ba3fa488b77d8f5032" (UID: "959c75833224b4ba3fa488b77d8f5032"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:36:29.654295 master-0 kubenswrapper[28758]: I0223 14:36:29.654277 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") pod \"959c75833224b4ba3fa488b77d8f5032\" (UID: \"959c75833224b4ba3fa488b77d8f5032\") " Feb 23 14:36:29.654450 master-0 kubenswrapper[28758]: I0223 14:36:29.654310 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "959c75833224b4ba3fa488b77d8f5032" (UID: "959c75833224b4ba3fa488b77d8f5032"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:36:29.654450 master-0 kubenswrapper[28758]: I0223 14:36:29.654411 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "959c75833224b4ba3fa488b77d8f5032" (UID: "959c75833224b4ba3fa488b77d8f5032"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:36:29.654665 master-0 kubenswrapper[28758]: I0223 14:36:29.654634 28758 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-cert-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:29.654665 master-0 kubenswrapper[28758]: I0223 14:36:29.654651 28758 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-audit-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:29.654665 master-0 kubenswrapper[28758]: I0223 14:36:29.654659 28758 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/959c75833224b4ba3fa488b77d8f5032-resource-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:29.657530 master-0 kubenswrapper[28758]: I0223 14:36:29.657316 28758 scope.go:117] "RemoveContainer" containerID="db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa" Feb 23 14:36:29.669132 master-0 kubenswrapper[28758]: I0223 14:36:29.669080 28758 scope.go:117] "RemoveContainer" containerID="8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2" Feb 23 14:36:29.685698 master-0 kubenswrapper[28758]: I0223 14:36:29.685642 28758 scope.go:117] "RemoveContainer" containerID="40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211" Feb 23 14:36:29.710558 master-0 kubenswrapper[28758]: I0223 14:36:29.710498 28758 scope.go:117] 
"RemoveContainer" containerID="f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4" Feb 23 14:36:29.711151 master-0 kubenswrapper[28758]: E0223 14:36:29.711112 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4\": container with ID starting with f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4 not found: ID does not exist" containerID="f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4" Feb 23 14:36:29.711202 master-0 kubenswrapper[28758]: I0223 14:36:29.711165 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4"} err="failed to get container status \"f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4\": rpc error: code = NotFound desc = could not find container \"f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4\": container with ID starting with f441f13cd166edd4b57db31fab13e616ea6f8d8cb025cae8395a8b8e6b1895b4 not found: ID does not exist" Feb 23 14:36:29.711202 master-0 kubenswrapper[28758]: I0223 14:36:29.711193 28758 scope.go:117] "RemoveContainer" containerID="03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9" Feb 23 14:36:29.711655 master-0 kubenswrapper[28758]: E0223 14:36:29.711554 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9\": container with ID starting with 03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9 not found: ID does not exist" containerID="03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9" Feb 23 14:36:29.711655 master-0 kubenswrapper[28758]: I0223 14:36:29.711578 28758 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9"} err="failed to get container status \"03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9\": rpc error: code = NotFound desc = could not find container \"03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9\": container with ID starting with 03f063ba05fdab3010ecf16036a12816aeaffe3e50e5e9cbc85f6a31b61cfdf9 not found: ID does not exist" Feb 23 14:36:29.711655 master-0 kubenswrapper[28758]: I0223 14:36:29.711593 28758 scope.go:117] "RemoveContainer" containerID="f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193" Feb 23 14:36:29.711958 master-0 kubenswrapper[28758]: E0223 14:36:29.711924 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193\": container with ID starting with f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193 not found: ID does not exist" containerID="f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193" Feb 23 14:36:29.712018 master-0 kubenswrapper[28758]: I0223 14:36:29.711957 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193"} err="failed to get container status \"f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193\": rpc error: code = NotFound desc = could not find container \"f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193\": container with ID starting with f48e8181c4b1545411cb74be92ee46d43b72112e508bbbde3d0b0625382cc193 not found: ID does not exist" Feb 23 14:36:29.712018 master-0 kubenswrapper[28758]: I0223 14:36:29.711985 28758 scope.go:117] "RemoveContainer" containerID="db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa" Feb 23 14:36:29.712437 master-0 kubenswrapper[28758]: E0223 
14:36:29.712414 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa\": container with ID starting with db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa not found: ID does not exist" containerID="db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa" Feb 23 14:36:29.712519 master-0 kubenswrapper[28758]: I0223 14:36:29.712437 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa"} err="failed to get container status \"db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa\": rpc error: code = NotFound desc = could not find container \"db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa\": container with ID starting with db3bcfab91dd51437354641c4ae5c853de368a114ab29260a261ff0716a6e3aa not found: ID does not exist" Feb 23 14:36:29.712519 master-0 kubenswrapper[28758]: I0223 14:36:29.712451 28758 scope.go:117] "RemoveContainer" containerID="8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2" Feb 23 14:36:29.713059 master-0 kubenswrapper[28758]: E0223 14:36:29.712949 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2\": container with ID starting with 8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2 not found: ID does not exist" containerID="8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2" Feb 23 14:36:29.713059 master-0 kubenswrapper[28758]: I0223 14:36:29.712978 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2"} err="failed to get container status 
\"8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2\": rpc error: code = NotFound desc = could not find container \"8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2\": container with ID starting with 8daf30ba9c8438cb4829a86d151f5a35be6ceb72bec19daa3149b2925e7076a2 not found: ID does not exist" Feb 23 14:36:29.713059 master-0 kubenswrapper[28758]: I0223 14:36:29.712997 28758 scope.go:117] "RemoveContainer" containerID="40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211" Feb 23 14:36:29.713700 master-0 kubenswrapper[28758]: E0223 14:36:29.713654 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211\": container with ID starting with 40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211 not found: ID does not exist" containerID="40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211" Feb 23 14:36:29.713757 master-0 kubenswrapper[28758]: I0223 14:36:29.713712 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211"} err="failed to get container status \"40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211\": rpc error: code = NotFound desc = could not find container \"40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211\": container with ID starting with 40cb1664e8a96775d97586c3b2bf51f0c43fd54057e211ddda21f17bebe65211 not found: ID does not exist" Feb 23 14:36:29.927834 master-0 kubenswrapper[28758]: I0223 14:36:29.927769 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial 
tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.928532 master-0 kubenswrapper[28758]: I0223 14:36:29.928470 28758 status_manager.go:851] "Failed to get status for pod" podUID="959c75833224b4ba3fa488b77d8f5032" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:29.929357 master-0 kubenswrapper[28758]: I0223 14:36:29.929283 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:30.102844 master-0 kubenswrapper[28758]: I0223 14:36:30.102767 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959c75833224b4ba3fa488b77d8f5032" path="/var/lib/kubelet/pods/959c75833224b4ba3fa488b77d8f5032/volumes" Feb 23 14:36:30.121516 master-0 kubenswrapper[28758]: I0223 14:36:30.121418 28758 patch_prober.go:28] interesting pod/console-556bbd75bc-q4jfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Feb 23 14:36:30.121799 master-0 kubenswrapper[28758]: I0223 14:36:30.121544 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Feb 23 14:36:30.655924 master-0 kubenswrapper[28758]: E0223 14:36:30.655800 28758 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Feb 23 14:36:33.857430 master-0 kubenswrapper[28758]: E0223 14:36:33.857332 28758 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Feb 23 14:36:34.193571 master-0 kubenswrapper[28758]: I0223 14:36:34.193447 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:34.193768 master-0 kubenswrapper[28758]: I0223 14:36:34.193583 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:36:34.196645 master-0 kubenswrapper[28758]: I0223 14:36:34.196588 28758 patch_prober.go:28] interesting pod/console-7c65cbb888-4xqr4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" start-of-body= Feb 23 14:36:34.196772 master-0 kubenswrapper[28758]: I0223 14:36:34.196662 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7c65cbb888-4xqr4" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" probeResult="failure" output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" Feb 23 14:36:35.997373 master-0 kubenswrapper[28758]: E0223 14:36:35.997048 28758 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.1896e6e7fc07d0b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:2146f0e3671998cad8bbc2464b009ab7,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fd63e2c1185e529c6e9f6e1426222ff2ac195132b44a1775f407e4593b66d4c\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-02-23 14:36:27.457073333 +0000 UTC m=+119.583389265,LastTimestamp:2026-02-23 14:36:27.457073333 +0000 UTC m=+119.583389265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Feb 23 14:36:38.088036 master-0 kubenswrapper[28758]: I0223 14:36:38.087927 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:38.093209 master-0 kubenswrapper[28758]: I0223 14:36:38.093067 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:38.094257 master-0 kubenswrapper[28758]: I0223 14:36:38.094140 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:38.095746 master-0 kubenswrapper[28758]: I0223 14:36:38.095653 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:38.096240 master-0 kubenswrapper[28758]: I0223 14:36:38.096182 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:38.126599 master-0 kubenswrapper[28758]: I0223 14:36:38.126505 28758 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:38.126599 master-0 kubenswrapper[28758]: I0223 14:36:38.126562 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:38.127571 master-0 kubenswrapper[28758]: E0223 14:36:38.127447 28758 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:38.128401 master-0 kubenswrapper[28758]: I0223 14:36:38.128296 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:38.167612 master-0 kubenswrapper[28758]: W0223 14:36:38.167532 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487622064474ed0ec70f7bf2a0fcb80b.slice/crio-5f9896c2c6b52ba0ce21620f1872e016723a6320628d7059a04116d581937739 WatchSource:0}: Error finding container 5f9896c2c6b52ba0ce21620f1872e016723a6320628d7059a04116d581937739: Status 404 returned error can't find the container with id 5f9896c2c6b52ba0ce21620f1872e016723a6320628d7059a04116d581937739 Feb 23 14:36:38.680404 master-0 kubenswrapper[28758]: I0223 14:36:38.680258 28758 generic.go:334] "Generic (PLEG): container finished" podID="487622064474ed0ec70f7bf2a0fcb80b" containerID="56cf075e055ff35cea5870931c1bc804544740a2f9a7983d36923353d062ffdd" exitCode=0 Feb 23 14:36:38.680404 master-0 kubenswrapper[28758]: I0223 14:36:38.680343 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"487622064474ed0ec70f7bf2a0fcb80b","Type":"ContainerDied","Data":"56cf075e055ff35cea5870931c1bc804544740a2f9a7983d36923353d062ffdd"} Feb 23 14:36:38.680670 master-0 kubenswrapper[28758]: I0223 14:36:38.680463 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"487622064474ed0ec70f7bf2a0fcb80b","Type":"ContainerStarted","Data":"5f9896c2c6b52ba0ce21620f1872e016723a6320628d7059a04116d581937739"} Feb 23 14:36:38.681075 master-0 kubenswrapper[28758]: I0223 14:36:38.681027 28758 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:38.681125 master-0 kubenswrapper[28758]: I0223 14:36:38.681073 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:38.682168 master-0 kubenswrapper[28758]: E0223 14:36:38.682104 28758 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:38.682252 master-0 kubenswrapper[28758]: I0223 14:36:38.682103 28758 status_manager.go:851] "Failed to get status for pod" podUID="069569a4-34c1-4752-af70-b31bcfca4177" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:38.683880 master-0 kubenswrapper[28758]: I0223 14:36:38.683300 28758 status_manager.go:851] "Failed to get status for pod" podUID="2146f0e3671998cad8bbc2464b009ab7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Feb 23 14:36:39.694677 master-0 kubenswrapper[28758]: I0223 14:36:39.694603 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"487622064474ed0ec70f7bf2a0fcb80b","Type":"ContainerStarted","Data":"e488ecaa2b4f660f184cfc1fef32f12fa6a76aac1ad8e5eec00a9250ca7aaafe"} Feb 23 14:36:39.694677 master-0 kubenswrapper[28758]: I0223 14:36:39.694657 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"487622064474ed0ec70f7bf2a0fcb80b","Type":"ContainerStarted","Data":"c91c53d27899902747387ba92dc5c80dbab4cab1909c8ea7ee8dc3f9ab7012d4"} Feb 23 14:36:39.694677 master-0 kubenswrapper[28758]: I0223 14:36:39.694667 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"487622064474ed0ec70f7bf2a0fcb80b","Type":"ContainerStarted","Data":"c7de474d394e3fff01d1e175080b6c6c80f1c2755e6aa91aadee8bc80947eab4"} Feb 23 14:36:40.121053 master-0 kubenswrapper[28758]: I0223 14:36:40.121004 28758 patch_prober.go:28] interesting pod/console-556bbd75bc-q4jfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Feb 23 14:36:40.121319 master-0 kubenswrapper[28758]: I0223 14:36:40.121065 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Feb 23 14:36:40.707204 master-0 kubenswrapper[28758]: I0223 14:36:40.707149 28758 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"487622064474ed0ec70f7bf2a0fcb80b","Type":"ContainerStarted","Data":"9ca0eb9f8cb28eabbddc16f9f6314096e97af9b67457ebf615ad253901193e0c"} Feb 23 14:36:40.707204 master-0 kubenswrapper[28758]: I0223 14:36:40.707198 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"487622064474ed0ec70f7bf2a0fcb80b","Type":"ContainerStarted","Data":"ba91043472ac532df4fba363063592c698c3328c29514a05ca4647f93a5fb855"} Feb 23 14:36:40.708100 master-0 kubenswrapper[28758]: I0223 14:36:40.707292 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:40.708100 master-0 kubenswrapper[28758]: I0223 14:36:40.707430 28758 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:40.708100 master-0 kubenswrapper[28758]: I0223 14:36:40.707457 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:41.718243 master-0 kubenswrapper[28758]: I0223 14:36:41.718084 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager/0.log" Feb 23 14:36:41.718243 master-0 kubenswrapper[28758]: I0223 14:36:41.718144 28758 generic.go:334] "Generic (PLEG): container finished" podID="181adc3f4810f127b44f3750f5d2460c" containerID="ea1409538bec46d9eceb195d8a31f70cddcab9c02d2f2d5acf77e88b46aed24f" exitCode=1 Feb 23 14:36:41.718243 master-0 kubenswrapper[28758]: I0223 14:36:41.718180 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerDied","Data":"ea1409538bec46d9eceb195d8a31f70cddcab9c02d2f2d5acf77e88b46aed24f"} Feb 23 14:36:41.719884 master-0 kubenswrapper[28758]: I0223 14:36:41.718765 28758 scope.go:117] "RemoveContainer" containerID="ea1409538bec46d9eceb195d8a31f70cddcab9c02d2f2d5acf77e88b46aed24f" Feb 23 14:36:42.733399 master-0 kubenswrapper[28758]: I0223 14:36:42.733362 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager/0.log" Feb 23 14:36:42.734090 master-0 kubenswrapper[28758]: I0223 14:36:42.734052 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"} Feb 23 14:36:43.128722 master-0 kubenswrapper[28758]: I0223 14:36:43.128589 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:43.128944 master-0 kubenswrapper[28758]: I0223 14:36:43.128808 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:43.134317 master-0 kubenswrapper[28758]: I0223 14:36:43.134280 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:44.194248 master-0 kubenswrapper[28758]: I0223 14:36:44.194178 28758 patch_prober.go:28] interesting pod/console-7c65cbb888-4xqr4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" start-of-body= Feb 23 14:36:44.194248 master-0 kubenswrapper[28758]: I0223 14:36:44.194245 
28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7c65cbb888-4xqr4" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" probeResult="failure" output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" Feb 23 14:36:45.771855 master-0 kubenswrapper[28758]: I0223 14:36:45.771801 28758 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:46.761353 master-0 kubenswrapper[28758]: I0223 14:36:46.761221 28758 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:46.761353 master-0 kubenswrapper[28758]: I0223 14:36:46.761275 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:46.770125 master-0 kubenswrapper[28758]: I0223 14:36:46.770054 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:36:47.768452 master-0 kubenswrapper[28758]: I0223 14:36:47.768386 28758 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:47.768452 master-0 kubenswrapper[28758]: I0223 14:36:47.768433 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="09a8c00b-d4df-401f-b301-08df8f97782b" Feb 23 14:36:48.105691 master-0 kubenswrapper[28758]: I0223 14:36:48.105521 28758 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="487622064474ed0ec70f7bf2a0fcb80b" podUID="8c491d2c-717f-469e-9d16-1740f62a5b3b" Feb 23 14:36:48.878232 master-0 kubenswrapper[28758]: I0223 14:36:48.878143 
28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6fb7cc48c6-v85rz" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" containerID="cri-o://84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e" gracePeriod=15 Feb 23 14:36:49.613181 master-0 kubenswrapper[28758]: I0223 14:36:49.613133 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fb7cc48c6-v85rz_eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e/console/0.log" Feb 23 14:36:49.613377 master-0 kubenswrapper[28758]: I0223 14:36:49.613232 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb7cc48c6-v85rz" Feb 23 14:36:49.665637 master-0 kubenswrapper[28758]: I0223 14:36:49.665581 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-oauth-serving-cert\") pod \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " Feb 23 14:36:49.666010 master-0 kubenswrapper[28758]: I0223 14:36:49.665984 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" (UID: "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:36:49.666113 master-0 kubenswrapper[28758]: I0223 14:36:49.666095 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-service-ca\") pod \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " Feb 23 14:36:49.666166 master-0 kubenswrapper[28758]: I0223 14:36:49.666123 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ghqw\" (UniqueName: \"kubernetes.io/projected/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-kube-api-access-9ghqw\") pod \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " Feb 23 14:36:49.666535 master-0 kubenswrapper[28758]: I0223 14:36:49.666510 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-service-ca" (OuterVolumeSpecName: "service-ca") pod "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" (UID: "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:36:49.666629 master-0 kubenswrapper[28758]: I0223 14:36:49.666601 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-trusted-ca-bundle\") pod \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " Feb 23 14:36:49.667030 master-0 kubenswrapper[28758]: I0223 14:36:49.667002 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" (UID: "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:36:49.667135 master-0 kubenswrapper[28758]: I0223 14:36:49.667113 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-config\") pod \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " Feb 23 14:36:49.667187 master-0 kubenswrapper[28758]: I0223 14:36:49.667170 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-serving-cert\") pod \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " Feb 23 14:36:49.667255 master-0 kubenswrapper[28758]: I0223 14:36:49.667234 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-oauth-config\") pod \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\" (UID: \"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e\") " Feb 23 14:36:49.667652 master-0 kubenswrapper[28758]: I0223 14:36:49.667602 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-config" (OuterVolumeSpecName: "console-config") pod "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" (UID: "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:36:49.667705 master-0 kubenswrapper[28758]: I0223 14:36:49.667632 28758 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:49.667705 master-0 kubenswrapper[28758]: I0223 14:36:49.667689 28758 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-service-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:49.667705 master-0 kubenswrapper[28758]: I0223 14:36:49.667703 28758 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:49.670240 master-0 kubenswrapper[28758]: I0223 14:36:49.670202 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" (UID: "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:36:49.670789 master-0 kubenswrapper[28758]: I0223 14:36:49.670725 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" (UID: "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:36:49.672121 master-0 kubenswrapper[28758]: I0223 14:36:49.672078 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-kube-api-access-9ghqw" (OuterVolumeSpecName: "kube-api-access-9ghqw") pod "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" (UID: "eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e"). InnerVolumeSpecName "kube-api-access-9ghqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:36:49.741628 master-0 kubenswrapper[28758]: I0223 14:36:49.741569 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:36:49.741934 master-0 kubenswrapper[28758]: I0223 14:36:49.741901 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:36:49.742277 master-0 kubenswrapper[28758]: I0223 14:36:49.742239 28758 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Feb 23 14:36:49.742386 master-0 kubenswrapper[28758]: I0223 14:36:49.742281 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Feb 23 14:36:49.768768 master-0 kubenswrapper[28758]: I0223 14:36:49.768703 28758 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:49.768768 master-0 kubenswrapper[28758]: I0223 14:36:49.768754 28758 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:49.768768 master-0 kubenswrapper[28758]: I0223 14:36:49.768767 28758 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:49.768768 master-0 kubenswrapper[28758]: I0223 14:36:49.768776 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ghqw\" (UniqueName: \"kubernetes.io/projected/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e-kube-api-access-9ghqw\") on node \"master-0\" DevicePath \"\"" Feb 23 14:36:49.782222 master-0 kubenswrapper[28758]: I0223 14:36:49.782187 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fb7cc48c6-v85rz_eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e/console/0.log" Feb 23 14:36:49.782529 master-0 kubenswrapper[28758]: I0223 14:36:49.782502 28758 generic.go:334] "Generic (PLEG): container finished" podID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerID="84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e" exitCode=2 Feb 23 14:36:49.782650 master-0 kubenswrapper[28758]: I0223 14:36:49.782556 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb7cc48c6-v85rz" Feb 23 14:36:49.782745 master-0 kubenswrapper[28758]: I0223 14:36:49.782573 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7cc48c6-v85rz" event={"ID":"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e","Type":"ContainerDied","Data":"84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e"} Feb 23 14:36:49.782809 master-0 kubenswrapper[28758]: I0223 14:36:49.782753 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb7cc48c6-v85rz" event={"ID":"eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e","Type":"ContainerDied","Data":"c7d2433a662653f036ca2b839c6751db69a8b78e4ff9f783c75994c09abfaada"} Feb 23 14:36:49.782809 master-0 kubenswrapper[28758]: I0223 14:36:49.782771 28758 scope.go:117] "RemoveContainer" containerID="84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e" Feb 23 14:36:49.801153 master-0 kubenswrapper[28758]: I0223 14:36:49.801111 28758 scope.go:117] "RemoveContainer" containerID="84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e" Feb 23 14:36:49.801544 master-0 kubenswrapper[28758]: E0223 14:36:49.801487 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e\": container with ID starting with 84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e not found: ID does not exist" containerID="84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e" Feb 23 14:36:49.801615 master-0 kubenswrapper[28758]: I0223 14:36:49.801533 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e"} err="failed to get container status \"84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e\": rpc error: code = NotFound desc = could not find 
container \"84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e\": container with ID starting with 84008195bd56420835a8499cc20178119ee221e5ffd1f50ddbe5e8c00e472d9e not found: ID does not exist" Feb 23 14:36:50.121178 master-0 kubenswrapper[28758]: I0223 14:36:50.121111 28758 patch_prober.go:28] interesting pod/console-556bbd75bc-q4jfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Feb 23 14:36:50.121178 master-0 kubenswrapper[28758]: I0223 14:36:50.121164 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Feb 23 14:36:52.303687 master-0 kubenswrapper[28758]: I0223 14:36:52.303565 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 23 14:36:52.477051 master-0 kubenswrapper[28758]: I0223 14:36:52.476945 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 14:36:52.482707 master-0 kubenswrapper[28758]: I0223 14:36:52.482637 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 23 14:36:52.692871 master-0 kubenswrapper[28758]: I0223 14:36:52.692811 28758 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 14:36:52.752101 master-0 kubenswrapper[28758]: I0223 14:36:52.752019 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 23 14:36:52.779687 master-0 kubenswrapper[28758]: I0223 14:36:52.779562 28758 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 14:36:52.897435 master-0 kubenswrapper[28758]: I0223 14:36:52.897335 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 23 14:36:53.207271 master-0 kubenswrapper[28758]: I0223 14:36:53.207184 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 23 14:36:53.271785 master-0 kubenswrapper[28758]: I0223 14:36:53.271690 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-cpnmg" Feb 23 14:36:53.291980 master-0 kubenswrapper[28758]: I0223 14:36:53.291911 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-kdvbz" Feb 23 14:36:53.698219 master-0 kubenswrapper[28758]: I0223 14:36:53.698139 28758 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 14:36:54.042350 master-0 kubenswrapper[28758]: I0223 14:36:54.042111 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Feb 23 14:36:54.169966 master-0 kubenswrapper[28758]: I0223 14:36:54.169869 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Feb 23 14:36:54.193617 master-0 kubenswrapper[28758]: I0223 14:36:54.193458 28758 patch_prober.go:28] interesting pod/console-7c65cbb888-4xqr4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" start-of-body= Feb 23 14:36:54.193617 master-0 kubenswrapper[28758]: I0223 14:36:54.193574 28758 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-7c65cbb888-4xqr4" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" probeResult="failure" output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" Feb 23 14:36:54.303379 master-0 kubenswrapper[28758]: I0223 14:36:54.303176 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 14:36:54.449469 master-0 kubenswrapper[28758]: I0223 14:36:54.449357 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 14:36:54.689891 master-0 kubenswrapper[28758]: I0223 14:36:54.689820 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Feb 23 14:36:54.892077 master-0 kubenswrapper[28758]: I0223 14:36:54.892023 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-xc5pk" Feb 23 14:36:55.094820 master-0 kubenswrapper[28758]: I0223 14:36:55.094619 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 23 14:36:55.239804 master-0 kubenswrapper[28758]: I0223 14:36:55.239715 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 23 14:36:55.821725 master-0 kubenswrapper[28758]: I0223 14:36:55.821600 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 23 14:36:55.937551 master-0 kubenswrapper[28758]: I0223 14:36:55.937443 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 14:36:56.122933 master-0 kubenswrapper[28758]: I0223 14:36:56.122829 28758 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 23 14:36:56.463734 master-0 kubenswrapper[28758]: I0223 14:36:56.463616 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 14:36:56.799693 master-0 kubenswrapper[28758]: I0223 14:36:56.799286 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Feb 23 14:36:56.944045 master-0 kubenswrapper[28758]: I0223 14:36:56.943966 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Feb 23 14:36:57.065428 master-0 kubenswrapper[28758]: I0223 14:36:57.065300 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 23 14:36:57.318586 master-0 kubenswrapper[28758]: I0223 14:36:57.318405 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Feb 23 14:36:57.507950 master-0 kubenswrapper[28758]: I0223 14:36:57.507894 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 23 14:36:57.690907 master-0 kubenswrapper[28758]: I0223 14:36:57.690859 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-hnfcp" Feb 23 14:36:57.693634 master-0 kubenswrapper[28758]: I0223 14:36:57.693580 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 14:36:57.964886 master-0 kubenswrapper[28758]: I0223 14:36:57.964773 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Feb 23 14:36:57.978931 master-0 kubenswrapper[28758]: I0223 14:36:57.978887 28758 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 23 14:36:58.003159 master-0 kubenswrapper[28758]: I0223 14:36:58.003107 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-jgw8n" Feb 23 14:36:58.039353 master-0 kubenswrapper[28758]: I0223 14:36:58.039287 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 23 14:36:58.042845 master-0 kubenswrapper[28758]: I0223 14:36:58.042818 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 23 14:36:58.050345 master-0 kubenswrapper[28758]: I0223 14:36:58.050277 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Feb 23 14:36:58.318769 master-0 kubenswrapper[28758]: I0223 14:36:58.318606 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-zqmp5" Feb 23 14:36:58.325379 master-0 kubenswrapper[28758]: I0223 14:36:58.325308 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Feb 23 14:36:58.464020 master-0 kubenswrapper[28758]: I0223 14:36:58.463944 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 14:36:58.564811 master-0 kubenswrapper[28758]: I0223 14:36:58.564736 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Feb 23 14:36:58.610906 master-0 kubenswrapper[28758]: I0223 14:36:58.610734 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 23 14:36:58.905172 master-0 kubenswrapper[28758]: I0223 
14:36:58.904990 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 14:36:59.078103 master-0 kubenswrapper[28758]: I0223 14:36:59.078034 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 23 14:36:59.099221 master-0 kubenswrapper[28758]: I0223 14:36:59.099165 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 14:36:59.179408 master-0 kubenswrapper[28758]: I0223 14:36:59.179258 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Feb 23 14:36:59.362551 master-0 kubenswrapper[28758]: I0223 14:36:59.362469 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 14:36:59.484927 master-0 kubenswrapper[28758]: I0223 14:36:59.484857 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 23 14:36:59.580201 master-0 kubenswrapper[28758]: I0223 14:36:59.579981 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 14:36:59.742618 master-0 kubenswrapper[28758]: I0223 14:36:59.742411 28758 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Feb 23 14:36:59.742618 master-0 kubenswrapper[28758]: I0223 14:36:59.742543 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Feb 23 14:36:59.810511 master-0 kubenswrapper[28758]: I0223 14:36:59.810438 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 14:36:59.850973 master-0 kubenswrapper[28758]: I0223 14:36:59.850886 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Feb 23 14:37:00.043228 master-0 kubenswrapper[28758]: I0223 14:37:00.042980 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 14:37:00.044947 master-0 kubenswrapper[28758]: I0223 14:37:00.044907 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-jdc7c" Feb 23 14:37:00.121327 master-0 kubenswrapper[28758]: I0223 14:37:00.121220 28758 patch_prober.go:28] interesting pod/console-556bbd75bc-q4jfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Feb 23 14:37:00.121327 master-0 kubenswrapper[28758]: I0223 14:37:00.121290 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Feb 23 14:37:00.335710 master-0 kubenswrapper[28758]: I0223 14:37:00.335542 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-vvbrf" Feb 23 14:37:00.373259 master-0 kubenswrapper[28758]: I0223 
14:37:00.373192 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Feb 23 14:37:00.400900 master-0 kubenswrapper[28758]: I0223 14:37:00.400810 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Feb 23 14:37:00.459661 master-0 kubenswrapper[28758]: I0223 14:37:00.459565 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 14:37:00.467598 master-0 kubenswrapper[28758]: I0223 14:37:00.467450 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 23 14:37:00.640196 master-0 kubenswrapper[28758]: I0223 14:37:00.640044 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 14:37:00.741843 master-0 kubenswrapper[28758]: I0223 14:37:00.741782 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 14:37:00.865163 master-0 kubenswrapper[28758]: I0223 14:37:00.865057 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 14:37:00.930518 master-0 kubenswrapper[28758]: I0223 14:37:00.930338 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 23 14:37:00.999326 master-0 kubenswrapper[28758]: I0223 14:37:00.999243 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Feb 23 14:37:01.098270 master-0 kubenswrapper[28758]: I0223 14:37:01.098167 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Feb 23 14:37:01.211140 
master-0 kubenswrapper[28758]: I0223 14:37:01.210926 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Feb 23 14:37:01.239555 master-0 kubenswrapper[28758]: I0223 14:37:01.239452 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 23 14:37:01.547982 master-0 kubenswrapper[28758]: I0223 14:37:01.547751 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 14:37:01.557365 master-0 kubenswrapper[28758]: I0223 14:37:01.557296 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 23 14:37:01.646509 master-0 kubenswrapper[28758]: I0223 14:37:01.646439 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 23 14:37:01.729245 master-0 kubenswrapper[28758]: I0223 14:37:01.729160 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 14:37:01.831280 master-0 kubenswrapper[28758]: I0223 14:37:01.831100 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 23 14:37:01.876648 master-0 kubenswrapper[28758]: I0223 14:37:01.876545 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-hcgtz" Feb 23 14:37:01.885870 master-0 kubenswrapper[28758]: I0223 14:37:01.885768 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Feb 23 14:37:01.945278 master-0 kubenswrapper[28758]: I0223 14:37:01.945144 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 23 14:37:02.102926 master-0 kubenswrapper[28758]: I0223 14:37:02.102722 
28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 23 14:37:02.219742 master-0 kubenswrapper[28758]: I0223 14:37:02.219660 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Feb 23 14:37:02.238402 master-0 kubenswrapper[28758]: I0223 14:37:02.238168 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 23 14:37:02.250291 master-0 kubenswrapper[28758]: I0223 14:37:02.250215 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-bqgj7" Feb 23 14:37:02.298515 master-0 kubenswrapper[28758]: I0223 14:37:02.298423 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 23 14:37:02.379064 master-0 kubenswrapper[28758]: I0223 14:37:02.378909 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Feb 23 14:37:02.380119 master-0 kubenswrapper[28758]: I0223 14:37:02.380080 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 23 14:37:02.548008 master-0 kubenswrapper[28758]: I0223 14:37:02.547922 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 14:37:02.585701 master-0 kubenswrapper[28758]: I0223 14:37:02.585617 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 23 14:37:02.603201 master-0 kubenswrapper[28758]: I0223 14:37:02.603116 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 23 
14:37:02.725716 master-0 kubenswrapper[28758]: I0223 14:37:02.725632 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 14:37:02.734230 master-0 kubenswrapper[28758]: I0223 14:37:02.733804 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 23 14:37:02.734513 master-0 kubenswrapper[28758]: I0223 14:37:02.734446 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 23 14:37:02.814172 master-0 kubenswrapper[28758]: I0223 14:37:02.814119 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-f27w9" Feb 23 14:37:02.831142 master-0 kubenswrapper[28758]: I0223 14:37:02.831101 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-dt6sv" Feb 23 14:37:02.834540 master-0 kubenswrapper[28758]: I0223 14:37:02.834522 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 14:37:02.845860 master-0 kubenswrapper[28758]: I0223 14:37:02.845820 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 14:37:02.880783 master-0 kubenswrapper[28758]: I0223 14:37:02.880724 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Feb 23 14:37:02.927876 master-0 kubenswrapper[28758]: I0223 14:37:02.927817 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 14:37:02.968609 master-0 kubenswrapper[28758]: I0223 14:37:02.968510 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Feb 23 14:37:02.997763 
master-0 kubenswrapper[28758]: I0223 14:37:02.997464 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 23 14:37:03.292820 master-0 kubenswrapper[28758]: I0223 14:37:03.292678 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 14:37:03.355452 master-0 kubenswrapper[28758]: I0223 14:37:03.355380 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 14:37:03.386599 master-0 kubenswrapper[28758]: I0223 14:37:03.386537 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 14:37:03.506905 master-0 kubenswrapper[28758]: I0223 14:37:03.506834 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Feb 23 14:37:03.674194 master-0 kubenswrapper[28758]: I0223 14:37:03.674050 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Feb 23 14:37:03.693098 master-0 kubenswrapper[28758]: I0223 14:37:03.693037 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-b9hzz" Feb 23 14:37:03.700199 master-0 kubenswrapper[28758]: I0223 14:37:03.700117 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Feb 23 14:37:03.700199 master-0 kubenswrapper[28758]: I0223 14:37:03.700117 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Feb 23 14:37:03.718055 master-0 kubenswrapper[28758]: I0223 14:37:03.717989 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Feb 
23 14:37:03.725294 master-0 kubenswrapper[28758]: I0223 14:37:03.725238 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 23 14:37:03.862910 master-0 kubenswrapper[28758]: I0223 14:37:03.862828 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 23 14:37:03.973389 master-0 kubenswrapper[28758]: I0223 14:37:03.973315 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 23 14:37:04.105331 master-0 kubenswrapper[28758]: I0223 14:37:04.105271 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 14:37:04.132354 master-0 kubenswrapper[28758]: I0223 14:37:04.132310 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Feb 23 14:37:04.193828 master-0 kubenswrapper[28758]: I0223 14:37:04.193761 28758 patch_prober.go:28] interesting pod/console-7c65cbb888-4xqr4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" start-of-body= Feb 23 14:37:04.194224 master-0 kubenswrapper[28758]: I0223 14:37:04.194176 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7c65cbb888-4xqr4" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" probeResult="failure" output="Get \"https://10.128.0.101:8443/health\": dial tcp 10.128.0.101:8443: connect: connection refused" Feb 23 14:37:04.197839 master-0 kubenswrapper[28758]: I0223 14:37:04.197775 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 23 14:37:04.213016 
master-0 kubenswrapper[28758]: I0223 14:37:04.212960 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Feb 23 14:37:04.262818 master-0 kubenswrapper[28758]: I0223 14:37:04.262646 28758 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 23 14:37:04.266267 master-0 kubenswrapper[28758]: I0223 14:37:04.266194 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=37.266172734 podStartE2EDuration="37.266172734s" podCreationTimestamp="2026-02-23 14:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:36:45.798618093 +0000 UTC m=+137.924934075" watchObservedRunningTime="2026-02-23 14:37:04.266172734 +0000 UTC m=+156.392488706" Feb 23 14:37:04.269800 master-0 kubenswrapper[28758]: I0223 14:37:04.269762 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-console/console-6fb7cc48c6-v85rz"] Feb 23 14:37:04.269850 master-0 kubenswrapper[28758]: I0223 14:37:04.269825 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Feb 23 14:37:04.274679 master-0 kubenswrapper[28758]: I0223 14:37:04.274635 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Feb 23 14:37:04.290934 master-0 kubenswrapper[28758]: I0223 14:37:04.290882 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 23 14:37:04.309119 master-0 kubenswrapper[28758]: I0223 14:37:04.309053 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=19.309033623 
podStartE2EDuration="19.309033623s" podCreationTimestamp="2026-02-23 14:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:37:04.287467325 +0000 UTC m=+156.413783247" watchObservedRunningTime="2026-02-23 14:37:04.309033623 +0000 UTC m=+156.435349555" Feb 23 14:37:04.317619 master-0 kubenswrapper[28758]: I0223 14:37:04.317594 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 23 14:37:04.339146 master-0 kubenswrapper[28758]: I0223 14:37:04.339088 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Feb 23 14:37:04.437206 master-0 kubenswrapper[28758]: I0223 14:37:04.437154 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 14:37:04.553846 master-0 kubenswrapper[28758]: I0223 14:37:04.553702 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Feb 23 14:37:04.587081 master-0 kubenswrapper[28758]: I0223 14:37:04.587004 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Feb 23 14:37:04.611306 master-0 kubenswrapper[28758]: I0223 14:37:04.611227 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 14:37:04.691540 master-0 kubenswrapper[28758]: I0223 14:37:04.691363 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 23 14:37:04.734212 master-0 kubenswrapper[28758]: I0223 14:37:04.733995 28758 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 23 14:37:04.734656 master-0 kubenswrapper[28758]: I0223 14:37:04.734239 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Feb 23 14:37:04.757956 master-0 kubenswrapper[28758]: I0223 14:37:04.757879 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 23 14:37:04.760868 master-0 kubenswrapper[28758]: I0223 14:37:04.760826 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-wvfd9" Feb 23 14:37:04.861633 master-0 kubenswrapper[28758]: I0223 14:37:04.861423 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-5wrwc" Feb 23 14:37:04.897251 master-0 kubenswrapper[28758]: I0223 14:37:04.897188 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 14:37:04.933376 master-0 kubenswrapper[28758]: I0223 14:37:04.933310 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Feb 23 14:37:04.946146 master-0 kubenswrapper[28758]: I0223 14:37:04.946066 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 14:37:04.957086 master-0 kubenswrapper[28758]: I0223 14:37:04.957026 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 14:37:05.006313 master-0 kubenswrapper[28758]: I0223 14:37:05.006271 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 14:37:05.055751 master-0 
kubenswrapper[28758]: I0223 14:37:05.055698 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Feb 23 14:37:05.094431 master-0 kubenswrapper[28758]: I0223 14:37:05.094361 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 14:37:05.137842 master-0 kubenswrapper[28758]: I0223 14:37:05.137718 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 14:37:05.143445 master-0 kubenswrapper[28758]: I0223 14:37:05.143408 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 14:37:05.152341 master-0 kubenswrapper[28758]: I0223 14:37:05.152298 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-vkk2b" Feb 23 14:37:05.158540 master-0 kubenswrapper[28758]: I0223 14:37:05.158499 28758 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 14:37:05.232519 master-0 kubenswrapper[28758]: I0223 14:37:05.232380 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 14:37:05.272038 master-0 kubenswrapper[28758]: I0223 14:37:05.271983 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 14:37:05.302412 master-0 kubenswrapper[28758]: I0223 14:37:05.302340 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 14:37:05.310706 master-0 kubenswrapper[28758]: I0223 14:37:05.310654 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 23 14:37:05.430678 master-0 kubenswrapper[28758]: I0223 14:37:05.430442 28758 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 23 14:37:05.436392 master-0 kubenswrapper[28758]: I0223 14:37:05.436338 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 23 14:37:05.558276 master-0 kubenswrapper[28758]: I0223 14:37:05.558164 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 23 14:37:05.681727 master-0 kubenswrapper[28758]: I0223 14:37:05.681469 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 23 14:37:05.704147 master-0 kubenswrapper[28758]: I0223 14:37:05.704058 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 23 14:37:05.761708 master-0 kubenswrapper[28758]: I0223 14:37:05.761636 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 14:37:05.772355 master-0 kubenswrapper[28758]: I0223 14:37:05.772300 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 14:37:05.947429 master-0 kubenswrapper[28758]: I0223 14:37:05.947302 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 23 14:37:06.009586 master-0 kubenswrapper[28758]: I0223 14:37:06.009106 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 23 14:37:06.064961 master-0 kubenswrapper[28758]: I0223 14:37:06.064888 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 23 14:37:06.099804 master-0 kubenswrapper[28758]: I0223 14:37:06.099701 28758 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" path="/var/lib/kubelet/pods/eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e/volumes" Feb 23 14:37:06.115016 master-0 kubenswrapper[28758]: I0223 14:37:06.114978 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 14:37:06.131253 master-0 kubenswrapper[28758]: I0223 14:37:06.131174 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 23 14:37:06.132040 master-0 kubenswrapper[28758]: I0223 14:37:06.131998 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Feb 23 14:37:06.187090 master-0 kubenswrapper[28758]: I0223 14:37:06.187026 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Feb 23 14:37:06.199284 master-0 kubenswrapper[28758]: I0223 14:37:06.199093 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Feb 23 14:37:06.228876 master-0 kubenswrapper[28758]: I0223 14:37:06.228792 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-gd88p" Feb 23 14:37:06.331244 master-0 kubenswrapper[28758]: I0223 14:37:06.331116 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Feb 23 14:37:06.367969 master-0 kubenswrapper[28758]: I0223 14:37:06.367867 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 14:37:06.388634 master-0 kubenswrapper[28758]: I0223 14:37:06.388574 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-vzz9z" Feb 23 14:37:06.398978 master-0 
kubenswrapper[28758]: I0223 14:37:06.398936 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 23 14:37:06.423988 master-0 kubenswrapper[28758]: I0223 14:37:06.423926 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 23 14:37:06.543320 master-0 kubenswrapper[28758]: I0223 14:37:06.543217 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Feb 23 14:37:06.588515 master-0 kubenswrapper[28758]: I0223 14:37:06.588401 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 23 14:37:06.605153 master-0 kubenswrapper[28758]: I0223 14:37:06.605061 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 23 14:37:06.611500 master-0 kubenswrapper[28758]: I0223 14:37:06.611352 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Feb 23 14:37:06.621248 master-0 kubenswrapper[28758]: I0223 14:37:06.621147 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 14:37:06.625121 master-0 kubenswrapper[28758]: I0223 14:37:06.625071 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-h82x8" Feb 23 14:37:06.731313 master-0 kubenswrapper[28758]: I0223 14:37:06.730519 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 23 14:37:06.759580 master-0 kubenswrapper[28758]: I0223 14:37:06.759521 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 23 14:37:06.832310 master-0 
kubenswrapper[28758]: I0223 14:37:06.832178 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 23 14:37:06.858448 master-0 kubenswrapper[28758]: I0223 14:37:06.858384 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Feb 23 14:37:06.888978 master-0 kubenswrapper[28758]: I0223 14:37:06.888748 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Feb 23 14:37:06.949230 master-0 kubenswrapper[28758]: I0223 14:37:06.949148 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 14:37:06.974393 master-0 kubenswrapper[28758]: I0223 14:37:06.974305 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 14:37:07.010860 master-0 kubenswrapper[28758]: I0223 14:37:07.010794 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 23 14:37:07.149582 master-0 kubenswrapper[28758]: I0223 14:37:07.149372 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 14:37:07.186758 master-0 kubenswrapper[28758]: I0223 14:37:07.186657 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 14:37:07.262538 master-0 kubenswrapper[28758]: I0223 14:37:07.262420 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 23 14:37:07.287504 master-0 kubenswrapper[28758]: I0223 14:37:07.287405 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-srdtb" Feb 23 14:37:07.398046 master-0 
kubenswrapper[28758]: I0223 14:37:07.397970 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 23 14:37:07.401106 master-0 kubenswrapper[28758]: I0223 14:37:07.401022 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Feb 23 14:37:07.540406 master-0 kubenswrapper[28758]: I0223 14:37:07.540306 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 23 14:37:07.591883 master-0 kubenswrapper[28758]: I0223 14:37:07.591794 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 23 14:37:07.685928 master-0 kubenswrapper[28758]: I0223 14:37:07.685852 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 14:37:07.739621 master-0 kubenswrapper[28758]: I0223 14:37:07.739576 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:37:07.754070 master-0 kubenswrapper[28758]: I0223 14:37:07.754037 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 14:37:07.762581 master-0 kubenswrapper[28758]: I0223 14:37:07.762536 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 23 14:37:07.787289 master-0 kubenswrapper[28758]: I0223 14:37:07.787239 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 14:37:07.799103 master-0 kubenswrapper[28758]: I0223 14:37:07.799051 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Feb 23 14:37:07.948515 
master-0 kubenswrapper[28758]: I0223 14:37:07.948310 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 23 14:37:07.949959 master-0 kubenswrapper[28758]: I0223 14:37:07.949899 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 23 14:37:07.952814 master-0 kubenswrapper[28758]: I0223 14:37:07.952751 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 23 14:37:07.959215 master-0 kubenswrapper[28758]: I0223 14:37:07.959145 28758 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 14:37:07.977628 master-0 kubenswrapper[28758]: I0223 14:37:07.977572 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Feb 23 14:37:08.019770 master-0 kubenswrapper[28758]: I0223 14:37:08.017666 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 14:37:08.049956 master-0 kubenswrapper[28758]: I0223 14:37:08.049892 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 14:37:08.056512 master-0 kubenswrapper[28758]: I0223 14:37:08.056426 28758 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:37:08.056897 master-0 kubenswrapper[28758]: I0223 14:37:08.056853 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 14:37:08.057024 master-0 kubenswrapper[28758]: I0223 14:37:08.056821 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
podUID="2146f0e3671998cad8bbc2464b009ab7" containerName="startup-monitor" containerID="cri-o://27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c" gracePeriod=5 Feb 23 14:37:08.126104 master-0 kubenswrapper[28758]: I0223 14:37:08.126065 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 23 14:37:08.205011 master-0 kubenswrapper[28758]: I0223 14:37:08.204885 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 14:37:08.217194 master-0 kubenswrapper[28758]: I0223 14:37:08.217119 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-zc2l5" Feb 23 14:37:08.219221 master-0 kubenswrapper[28758]: I0223 14:37:08.219191 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Feb 23 14:37:08.257437 master-0 kubenswrapper[28758]: I0223 14:37:08.257379 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 23 14:37:08.291070 master-0 kubenswrapper[28758]: I0223 14:37:08.290958 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Feb 23 14:37:08.327123 master-0 kubenswrapper[28758]: I0223 14:37:08.327057 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 23 14:37:08.343612 master-0 kubenswrapper[28758]: I0223 14:37:08.343544 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-l72nr" Feb 23 14:37:08.371528 master-0 kubenswrapper[28758]: I0223 14:37:08.371432 28758 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 23 14:37:08.397449 master-0 kubenswrapper[28758]: I0223 14:37:08.397369 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 14:37:08.418328 master-0 kubenswrapper[28758]: I0223 14:37:08.418250 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-r2snt" Feb 23 14:37:08.551142 master-0 kubenswrapper[28758]: I0223 14:37:08.551013 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 23 14:37:08.626987 master-0 kubenswrapper[28758]: I0223 14:37:08.626855 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 23 14:37:08.704514 master-0 kubenswrapper[28758]: I0223 14:37:08.704430 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 23 14:37:08.766292 master-0 kubenswrapper[28758]: I0223 14:37:08.766217 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 23 14:37:08.786046 master-0 kubenswrapper[28758]: I0223 14:37:08.785980 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-hrd9b" Feb 23 14:37:08.787575 master-0 kubenswrapper[28758]: I0223 14:37:08.787542 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 14:37:08.857235 master-0 kubenswrapper[28758]: I0223 14:37:08.857114 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 23 14:37:08.946986 master-0 kubenswrapper[28758]: I0223 14:37:08.946916 28758 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 23 14:37:09.117807 master-0 kubenswrapper[28758]: I0223 14:37:09.117616 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-5q5j4" Feb 23 14:37:09.125666 master-0 kubenswrapper[28758]: I0223 14:37:09.125601 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 14:37:09.126854 master-0 kubenswrapper[28758]: I0223 14:37:09.126784 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 23 14:37:09.134113 master-0 kubenswrapper[28758]: I0223 14:37:09.134057 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Feb 23 14:37:09.144520 master-0 kubenswrapper[28758]: I0223 14:37:09.144411 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 14:37:09.207633 master-0 kubenswrapper[28758]: I0223 14:37:09.207516 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 23 14:37:09.213917 master-0 kubenswrapper[28758]: I0223 14:37:09.213858 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 23 14:37:09.234183 master-0 kubenswrapper[28758]: I0223 14:37:09.234117 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 23 14:37:09.240809 master-0 kubenswrapper[28758]: I0223 14:37:09.240771 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 23 14:37:09.332431 master-0 kubenswrapper[28758]: I0223 14:37:09.332344 28758 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 23 14:37:09.372837 master-0 kubenswrapper[28758]: I0223 14:37:09.372688 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 23 14:37:09.392895 master-0 kubenswrapper[28758]: I0223 14:37:09.392809 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 23 14:37:09.408847 master-0 kubenswrapper[28758]: I0223 14:37:09.408790 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 14:37:09.540128 master-0 kubenswrapper[28758]: I0223 14:37:09.540046 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 14:37:09.552755 master-0 kubenswrapper[28758]: I0223 14:37:09.552696 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Feb 23 14:37:09.558530 master-0 kubenswrapper[28758]: I0223 14:37:09.556576 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 23 14:37:09.564051 master-0 kubenswrapper[28758]: I0223 14:37:09.562666 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 23 14:37:09.679884 master-0 kubenswrapper[28758]: I0223 14:37:09.679689 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 14:37:09.742330 master-0 kubenswrapper[28758]: I0223 14:37:09.742210 28758 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 
192.168.32.10:10257: connect: connection refused" start-of-body= Feb 23 14:37:09.742699 master-0 kubenswrapper[28758]: I0223 14:37:09.742322 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Feb 23 14:37:09.742699 master-0 kubenswrapper[28758]: I0223 14:37:09.742625 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:37:09.746523 master-0 kubenswrapper[28758]: I0223 14:37:09.743610 28758 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 23 14:37:09.746523 master-0 kubenswrapper[28758]: I0223 14:37:09.743831 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" containerID="cri-o://7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a" gracePeriod=30 Feb 23 14:37:09.790892 master-0 kubenswrapper[28758]: I0223 14:37:09.790829 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 14:37:09.804137 master-0 kubenswrapper[28758]: I0223 14:37:09.804006 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 23 14:37:09.821430 master-0 
kubenswrapper[28758]: I0223 14:37:09.821317 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-8k4nq" Feb 23 14:37:09.852187 master-0 kubenswrapper[28758]: I0223 14:37:09.852114 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:37:09.875788 master-0 kubenswrapper[28758]: I0223 14:37:09.875731 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 14:37:10.017950 master-0 kubenswrapper[28758]: I0223 14:37:10.017876 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 23 14:37:10.120720 master-0 kubenswrapper[28758]: I0223 14:37:10.120637 28758 patch_prober.go:28] interesting pod/console-556bbd75bc-q4jfx container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" start-of-body= Feb 23 14:37:10.120997 master-0 kubenswrapper[28758]: I0223 14:37:10.120716 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.97:8443/health\": dial tcp 10.128.0.97:8443: connect: connection refused" Feb 23 14:37:10.210392 master-0 kubenswrapper[28758]: I0223 14:37:10.210320 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 14:37:10.224531 master-0 kubenswrapper[28758]: I0223 14:37:10.224464 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 23 14:37:10.239754 master-0 
kubenswrapper[28758]: I0223 14:37:10.239694 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Feb 23 14:37:10.306760 master-0 kubenswrapper[28758]: I0223 14:37:10.306635 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 14:37:10.382019 master-0 kubenswrapper[28758]: I0223 14:37:10.381945 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 23 14:37:10.426354 master-0 kubenswrapper[28758]: I0223 14:37:10.426297 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Feb 23 14:37:10.492133 master-0 kubenswrapper[28758]: I0223 14:37:10.492066 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 23 14:37:10.528986 master-0 kubenswrapper[28758]: I0223 14:37:10.528916 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 23 14:37:10.827509 master-0 kubenswrapper[28758]: I0223 14:37:10.827429 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 23 14:37:10.860829 master-0 kubenswrapper[28758]: I0223 14:37:10.860769 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Feb 23 14:37:11.070565 master-0 kubenswrapper[28758]: I0223 14:37:11.070500 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 23 14:37:11.242711 master-0 kubenswrapper[28758]: I0223 14:37:11.242652 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 23 
14:37:11.428505 master-0 kubenswrapper[28758]: I0223 14:37:11.428408 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Feb 23 14:37:11.451077 master-0 kubenswrapper[28758]: I0223 14:37:11.451003 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 14:37:11.616846 master-0 kubenswrapper[28758]: I0223 14:37:11.616680 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 23 14:37:11.691432 master-0 kubenswrapper[28758]: I0223 14:37:11.691351 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Feb 23 14:37:11.859104 master-0 kubenswrapper[28758]: I0223 14:37:11.859055 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 23 14:37:11.880797 master-0 kubenswrapper[28758]: I0223 14:37:11.880673 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Feb 23 14:37:11.883225 master-0 kubenswrapper[28758]: I0223 14:37:11.883175 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 23 14:37:12.101155 master-0 kubenswrapper[28758]: I0223 14:37:12.101078 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 23 14:37:12.155625 master-0 kubenswrapper[28758]: I0223 14:37:12.155396 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 14:37:12.166212 master-0 kubenswrapper[28758]: I0223 14:37:12.166144 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 23 14:37:12.174646 master-0 
kubenswrapper[28758]: I0223 14:37:12.174567 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 23 14:37:12.333883 master-0 kubenswrapper[28758]: I0223 14:37:12.333796 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Feb 23 14:37:12.457102 master-0 kubenswrapper[28758]: I0223 14:37:12.457037 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 14:37:12.493423 master-0 kubenswrapper[28758]: I0223 14:37:12.493338 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 23 14:37:12.612419 master-0 kubenswrapper[28758]: I0223 14:37:12.609636 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 14:37:12.974887 master-0 kubenswrapper[28758]: I0223 14:37:12.974817 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 23 14:37:13.115384 master-0 kubenswrapper[28758]: I0223 14:37:13.115332 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 23 14:37:13.144002 master-0 kubenswrapper[28758]: I0223 14:37:13.143944 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 23 14:37:13.192756 master-0 kubenswrapper[28758]: I0223 14:37:13.192671 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 23 14:37:13.197518 master-0 kubenswrapper[28758]: I0223 14:37:13.197426 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-j9tjr" Feb 23 14:37:13.378801 master-0 
kubenswrapper[28758]: I0223 14:37:13.378072 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 23 14:37:13.457058 master-0 kubenswrapper[28758]: I0223 14:37:13.456970 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 23 14:37:13.468467 master-0 kubenswrapper[28758]: I0223 14:37:13.468403 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 14:37:13.568807 master-0 kubenswrapper[28758]: I0223 14:37:13.568709 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Feb 23 14:37:13.637018 master-0 kubenswrapper[28758]: I0223 14:37:13.636910 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_2146f0e3671998cad8bbc2464b009ab7/startup-monitor/0.log" Feb 23 14:37:13.637018 master-0 kubenswrapper[28758]: I0223 14:37:13.636972 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:37:13.732419 master-0 kubenswrapper[28758]: I0223 14:37:13.732333 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-manifests\") pod \"2146f0e3671998cad8bbc2464b009ab7\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " Feb 23 14:37:13.732419 master-0 kubenswrapper[28758]: I0223 14:37:13.732426 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-log\") pod \"2146f0e3671998cad8bbc2464b009ab7\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " Feb 23 14:37:13.732419 master-0 kubenswrapper[28758]: I0223 14:37:13.732448 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-pod-resource-dir\") pod \"2146f0e3671998cad8bbc2464b009ab7\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " Feb 23 14:37:13.732939 master-0 kubenswrapper[28758]: I0223 14:37:13.732561 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-resource-dir\") pod \"2146f0e3671998cad8bbc2464b009ab7\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " Feb 23 14:37:13.732939 master-0 kubenswrapper[28758]: I0223 14:37:13.732600 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-lock\") pod \"2146f0e3671998cad8bbc2464b009ab7\" (UID: \"2146f0e3671998cad8bbc2464b009ab7\") " Feb 23 14:37:13.732939 master-0 kubenswrapper[28758]: I0223 14:37:13.732717 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-log" (OuterVolumeSpecName: "var-log") pod "2146f0e3671998cad8bbc2464b009ab7" (UID: "2146f0e3671998cad8bbc2464b009ab7"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:37:13.732939 master-0 kubenswrapper[28758]: I0223 14:37:13.732805 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-lock" (OuterVolumeSpecName: "var-lock") pod "2146f0e3671998cad8bbc2464b009ab7" (UID: "2146f0e3671998cad8bbc2464b009ab7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:37:13.732939 master-0 kubenswrapper[28758]: I0223 14:37:13.732730 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "2146f0e3671998cad8bbc2464b009ab7" (UID: "2146f0e3671998cad8bbc2464b009ab7"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:37:13.733305 master-0 kubenswrapper[28758]: I0223 14:37:13.733262 28758 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-resource-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:37:13.733305 master-0 kubenswrapper[28758]: I0223 14:37:13.733299 28758 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-lock\") on node \"master-0\" DevicePath \"\"" Feb 23 14:37:13.733389 master-0 kubenswrapper[28758]: I0223 14:37:13.733314 28758 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-var-log\") on node \"master-0\" DevicePath \"\"" Feb 23 14:37:13.733426 master-0 kubenswrapper[28758]: I0223 14:37:13.733412 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-manifests" (OuterVolumeSpecName: "manifests") pod "2146f0e3671998cad8bbc2464b009ab7" (UID: "2146f0e3671998cad8bbc2464b009ab7"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:37:13.737413 master-0 kubenswrapper[28758]: I0223 14:37:13.737343 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "2146f0e3671998cad8bbc2464b009ab7" (UID: "2146f0e3671998cad8bbc2464b009ab7"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:37:13.834973 master-0 kubenswrapper[28758]: I0223 14:37:13.834890 28758 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-manifests\") on node \"master-0\" DevicePath \"\"" Feb 23 14:37:13.834973 master-0 kubenswrapper[28758]: I0223 14:37:13.834938 28758 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/2146f0e3671998cad8bbc2464b009ab7-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:37:13.862416 master-0 kubenswrapper[28758]: I0223 14:37:13.862314 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-f7brl" Feb 23 14:37:13.985544 master-0 kubenswrapper[28758]: I0223 14:37:13.980435 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_2146f0e3671998cad8bbc2464b009ab7/startup-monitor/0.log" Feb 23 14:37:13.985544 master-0 kubenswrapper[28758]: I0223 14:37:13.980540 28758 generic.go:334] "Generic (PLEG): container finished" podID="2146f0e3671998cad8bbc2464b009ab7" containerID="27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c" exitCode=137 Feb 23 14:37:13.985544 master-0 kubenswrapper[28758]: I0223 14:37:13.980614 28758 scope.go:117] "RemoveContainer" containerID="27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c" Feb 23 14:37:13.985544 master-0 kubenswrapper[28758]: I0223 14:37:13.980709 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Feb 23 14:37:14.006912 master-0 kubenswrapper[28758]: I0223 14:37:14.006832 28758 scope.go:117] "RemoveContainer" containerID="27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c" Feb 23 14:37:14.007501 master-0 kubenswrapper[28758]: E0223 14:37:14.007399 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c\": container with ID starting with 27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c not found: ID does not exist" containerID="27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c" Feb 23 14:37:14.007647 master-0 kubenswrapper[28758]: I0223 14:37:14.007602 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c"} err="failed to get container status \"27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c\": rpc error: code = NotFound desc = could not find container \"27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c\": container with ID starting with 27ac3fa903e6a8eb9dd2efcd9b8ab1a296891591c03489473aaf98dc481f635c not found: ID does not exist" Feb 23 14:37:14.098628 master-0 kubenswrapper[28758]: I0223 14:37:14.098575 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 14:37:14.100450 master-0 kubenswrapper[28758]: I0223 14:37:14.100409 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2146f0e3671998cad8bbc2464b009ab7" path="/var/lib/kubelet/pods/2146f0e3671998cad8bbc2464b009ab7/volumes" Feb 23 14:37:14.100843 master-0 kubenswrapper[28758]: I0223 14:37:14.100772 28758 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Feb 23 14:37:14.104968 master-0 kubenswrapper[28758]: I0223 14:37:14.104938 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 23 14:37:14.120382 master-0 kubenswrapper[28758]: I0223 14:37:14.120305 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:37:14.120382 master-0 kubenswrapper[28758]: I0223 14:37:14.120350 28758 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="908c0646-c03e-4edf-a744-9b5895c4964a" Feb 23 14:37:14.125542 master-0 kubenswrapper[28758]: I0223 14:37:14.125493 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Feb 23 14:37:14.125542 master-0 kubenswrapper[28758]: I0223 14:37:14.125535 28758 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="908c0646-c03e-4edf-a744-9b5895c4964a" Feb 23 14:37:14.199247 master-0 kubenswrapper[28758]: I0223 14:37:14.199119 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:37:14.203606 master-0 kubenswrapper[28758]: I0223 14:37:14.203562 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:37:14.512871 master-0 kubenswrapper[28758]: I0223 14:37:14.512745 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 23 14:37:14.706988 master-0 kubenswrapper[28758]: I0223 14:37:14.706885 28758 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"etcd-client" Feb 23 14:37:14.882860 master-0 kubenswrapper[28758]: I0223 14:37:14.882646 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 23 14:37:15.592979 master-0 kubenswrapper[28758]: I0223 14:37:15.592897 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 14:37:15.634735 master-0 kubenswrapper[28758]: I0223 14:37:15.634646 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-dnwls" Feb 23 14:37:15.909887 master-0 kubenswrapper[28758]: I0223 14:37:15.909744 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-zmwkf" Feb 23 14:37:16.197462 master-0 kubenswrapper[28758]: I0223 14:37:16.197344 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-car2df00nf4i0" Feb 23 14:37:20.126730 master-0 kubenswrapper[28758]: I0223 14:37:20.126661 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:37:20.130818 master-0 kubenswrapper[28758]: I0223 14:37:20.130754 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-556bbd75bc-q4jfx" Feb 23 14:37:40.207780 master-0 kubenswrapper[28758]: I0223 14:37:40.207703 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager/1.log" Feb 23 14:37:40.210143 master-0 kubenswrapper[28758]: I0223 14:37:40.210101 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager/0.log" Feb 23 14:37:40.210242 master-0 kubenswrapper[28758]: I0223 14:37:40.210195 28758 generic.go:334] "Generic (PLEG): container finished" podID="181adc3f4810f127b44f3750f5d2460c" containerID="7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a" exitCode=137 Feb 23 14:37:40.210285 master-0 kubenswrapper[28758]: I0223 14:37:40.210257 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerDied","Data":"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"} Feb 23 14:37:40.210352 master-0 kubenswrapper[28758]: I0223 14:37:40.210325 28758 scope.go:117] "RemoveContainer" containerID="ea1409538bec46d9eceb195d8a31f70cddcab9c02d2f2d5acf77e88b46aed24f" Feb 23 14:37:41.222626 master-0 kubenswrapper[28758]: I0223 14:37:41.222552 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager/1.log" Feb 23 14:37:41.223989 master-0 kubenswrapper[28758]: I0223 14:37:41.223897 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"181adc3f4810f127b44f3750f5d2460c","Type":"ContainerStarted","Data":"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"} Feb 23 14:37:49.742221 master-0 kubenswrapper[28758]: I0223 14:37:49.742110 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:37:49.743173 master-0 kubenswrapper[28758]: I0223 14:37:49.742245 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:37:49.746828 master-0 kubenswrapper[28758]: I0223 14:37:49.746748 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:37:50.299764 master-0 kubenswrapper[28758]: I0223 14:37:50.299642 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Feb 23 14:37:57.323451 master-0 kubenswrapper[28758]: I0223 14:37:57.323378 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-sxgv7"] Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: E0223 14:37:57.323716 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: I0223 14:37:57.323733 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: E0223 14:37:57.323752 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2146f0e3671998cad8bbc2464b009ab7" containerName="startup-monitor" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: I0223 14:37:57.323760 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="2146f0e3671998cad8bbc2464b009ab7" containerName="startup-monitor" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: E0223 14:37:57.323786 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069569a4-34c1-4752-af70-b31bcfca4177" containerName="installer" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: I0223 14:37:57.323795 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="069569a4-34c1-4752-af70-b31bcfca4177" containerName="installer" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: I0223 
14:37:57.323951 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="2146f0e3671998cad8bbc2464b009ab7" containerName="startup-monitor" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: I0223 14:37:57.323983 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="069569a4-34c1-4752-af70-b31bcfca4177" containerName="installer" Feb 23 14:37:57.324157 master-0 kubenswrapper[28758]: I0223 14:37:57.324012 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6ad4fa-0ddd-42e2-8a8c-535f1a30df9e" containerName="console" Feb 23 14:37:57.324585 master-0 kubenswrapper[28758]: I0223 14:37:57.324560 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.348985 master-0 kubenswrapper[28758]: I0223 14:37:57.348934 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 14:37:57.349181 master-0 kubenswrapper[28758]: I0223 14:37:57.349144 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-gxgmg" Feb 23 14:37:57.359966 master-0 kubenswrapper[28758]: I0223 14:37:57.359635 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0325482a-66bb-476c-93ac-01b718836a37-serviceca\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.359966 master-0 kubenswrapper[28758]: I0223 14:37:57.359676 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0325482a-66bb-476c-93ac-01b718836a37-host\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.359966 master-0 
kubenswrapper[28758]: I0223 14:37:57.359698 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk6qg\" (UniqueName: \"kubernetes.io/projected/0325482a-66bb-476c-93ac-01b718836a37-kube-api-access-vk6qg\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.365509 master-0 kubenswrapper[28758]: I0223 14:37:57.362793 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-556bbd75bc-q4jfx"] Feb 23 14:37:57.407747 master-0 kubenswrapper[28758]: I0223 14:37:57.407671 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-f55d8f669-b2gf9"] Feb 23 14:37:57.408025 master-0 kubenswrapper[28758]: I0223 14:37:57.407979 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" podUID="9416f5d0-32b4-4065-b678-26913af8b6dd" containerName="metrics-server" containerID="cri-o://f866731e4ac5121ccde39a6f28422037df55500fc5889296919662d103c3a36f" gracePeriod=170 Feb 23 14:37:57.413433 master-0 kubenswrapper[28758]: I0223 14:37:57.413360 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-dbf68b6c5-fp955"] Feb 23 14:37:57.417929 master-0 kubenswrapper[28758]: I0223 14:37:57.416502 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.419326 master-0 kubenswrapper[28758]: I0223 14:37:57.418965 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Feb 23 14:37:57.419326 master-0 kubenswrapper[28758]: I0223 14:37:57.419135 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Feb 23 14:37:57.419326 master-0 kubenswrapper[28758]: I0223 14:37:57.419277 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Feb 23 14:37:57.419489 master-0 kubenswrapper[28758]: I0223 14:37:57.419391 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-tscn7" Feb 23 14:37:57.419574 master-0 kubenswrapper[28758]: I0223 14:37:57.419542 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Feb 23 14:37:57.419695 master-0 kubenswrapper[28758]: I0223 14:37:57.419658 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56556ccb8b-kfqz7"] Feb 23 14:37:57.423465 master-0 kubenswrapper[28758]: I0223 14:37:57.420660 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.424539 master-0 kubenswrapper[28758]: I0223 14:37:57.424507 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7bb4c94777-hhjw5"] Feb 23 14:37:57.429458 master-0 kubenswrapper[28758]: I0223 14:37:57.428043 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.433125 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.434239 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-18dsosi089n0n" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.434827 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-8jdq5bq3t36v7" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.435331 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.435355 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.435408 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.435468 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.435526 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Feb 23 14:37:57.439604 master-0 kubenswrapper[28758]: I0223 14:37:57.436425 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Feb 23 14:37:57.450005 master-0 kubenswrapper[28758]: I0223 14:37:57.449952 
28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56556ccb8b-kfqz7"] Feb 23 14:37:57.458391 master-0 kubenswrapper[28758]: I0223 14:37:57.458348 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-dbf68b6c5-fp955"] Feb 23 14:37:57.460538 master-0 kubenswrapper[28758]: I0223 14:37:57.460488 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.460616 master-0 kubenswrapper[28758]: I0223 14:37:57.460549 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/810a9771-08c1-45d8-944c-f9a341d90bec-audit-log\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.460616 master-0 kubenswrapper[28758]: I0223 14:37:57.460579 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.460616 master-0 kubenswrapper[28758]: I0223 14:37:57.460598 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/810a9771-08c1-45d8-944c-f9a341d90bec-metrics-server-audit-profiles\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.460708 master-0 kubenswrapper[28758]: I0223 14:37:57.460632 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0325482a-66bb-476c-93ac-01b718836a37-host\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.460708 master-0 kubenswrapper[28758]: I0223 14:37:57.460650 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0325482a-66bb-476c-93ac-01b718836a37-serviceca\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.460708 master-0 kubenswrapper[28758]: I0223 14:37:57.460675 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk6qg\" (UniqueName: \"kubernetes.io/projected/0325482a-66bb-476c-93ac-01b718836a37-kube-api-access-vk6qg\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.460808 master-0 kubenswrapper[28758]: I0223 14:37:57.460705 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-secret-metrics-server-tls\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.460808 master-0 kubenswrapper[28758]: I0223 14:37:57.460735 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.460808 master-0 kubenswrapper[28758]: I0223 14:37:57.460759 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.460808 master-0 kubenswrapper[28758]: I0223 14:37:57.460778 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-serving-certs-ca-bundle\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.460808 master-0 kubenswrapper[28758]: I0223 14:37:57.460796 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dd78\" (UniqueName: \"kubernetes.io/projected/29e46f35-a59a-4c26-82dd-d7573bdcb564-kube-api-access-8dd78\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.460978 master-0 kubenswrapper[28758]: I0223 14:37:57.460874 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-metrics-client-ca\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.460978 master-0 kubenswrapper[28758]: I0223 14:37:57.460953 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0325482a-66bb-476c-93ac-01b718836a37-host\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.461134 master-0 kubenswrapper[28758]: I0223 14:37:57.461101 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-tls\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.461182 master-0 kubenswrapper[28758]: I0223 14:37:57.461137 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.461301 master-0 kubenswrapper[28758]: I0223 14:37:57.461272 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0325482a-66bb-476c-93ac-01b718836a37-serviceca\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.461343 master-0 kubenswrapper[28758]: I0223 14:37:57.461275 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-secret-metrics-client-certs\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.461386 master-0 kubenswrapper[28758]: I0223 14:37:57.461366 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqcj\" (UniqueName: \"kubernetes.io/projected/212126c7-1db5-456c-add5-d0e3f38fe315-kube-api-access-4vqcj\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.461438 master-0 kubenswrapper[28758]: I0223 14:37:57.461408 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a9771-08c1-45d8-944c-f9a341d90bec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.461438 master-0 kubenswrapper[28758]: I0223 14:37:57.461429 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29e46f35-a59a-4c26-82dd-d7573bdcb564-metrics-client-ca\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.461556 master-0 kubenswrapper[28758]: I0223 14:37:57.461449 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-client-ca-bundle\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.462134 master-0 kubenswrapper[28758]: I0223 14:37:57.461784 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-grpc-tls\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.462134 master-0 kubenswrapper[28758]: I0223 14:37:57.461826 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpdf8\" (UniqueName: \"kubernetes.io/projected/810a9771-08c1-45d8-944c-f9a341d90bec-kube-api-access-xpdf8\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.462134 master-0 kubenswrapper[28758]: I0223 14:37:57.461845 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.462134 master-0 kubenswrapper[28758]: I0223 14:37:57.461884 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-federate-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: 
\"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.462134 master-0 kubenswrapper[28758]: I0223 14:37:57.461904 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-trusted-ca-bundle\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.462134 master-0 kubenswrapper[28758]: I0223 14:37:57.461946 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-secret-telemeter-client\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.464372 master-0 kubenswrapper[28758]: I0223 14:37:57.464311 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7bb4c94777-hhjw5"] Feb 23 14:37:57.477606 master-0 kubenswrapper[28758]: I0223 14:37:57.476986 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk6qg\" (UniqueName: \"kubernetes.io/projected/0325482a-66bb-476c-93ac-01b718836a37-kube-api-access-vk6qg\") pod \"node-ca-sxgv7\" (UID: \"0325482a-66bb-476c-93ac-01b718836a37\") " pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.563359 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqcj\" (UniqueName: \"kubernetes.io/projected/212126c7-1db5-456c-add5-d0e3f38fe315-kube-api-access-4vqcj\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " 
pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.563443 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a9771-08c1-45d8-944c-f9a341d90bec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.563612 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29e46f35-a59a-4c26-82dd-d7573bdcb564-metrics-client-ca\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.563681 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-client-ca-bundle\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.563869 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-grpc-tls\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.563953 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpdf8\" (UniqueName: 
\"kubernetes.io/projected/810a9771-08c1-45d8-944c-f9a341d90bec-kube-api-access-xpdf8\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.563977 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564024 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-federate-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564043 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-trusted-ca-bundle\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564103 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-secret-telemeter-client\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " 
pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564136 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564176 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/810a9771-08c1-45d8-944c-f9a341d90bec-audit-log\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564208 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564230 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/810a9771-08c1-45d8-944c-f9a341d90bec-metrics-server-audit-profiles\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564712 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/810a9771-08c1-45d8-944c-f9a341d90bec-audit-log\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564784 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/810a9771-08c1-45d8-944c-f9a341d90bec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.564881 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29e46f35-a59a-4c26-82dd-d7573bdcb564-metrics-client-ca\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565043 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-secret-metrics-server-tls\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565087 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: 
\"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565136 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565166 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-serving-certs-ca-bundle\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565198 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dd78\" (UniqueName: \"kubernetes.io/projected/29e46f35-a59a-4c26-82dd-d7573bdcb564-kube-api-access-8dd78\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565236 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-metrics-client-ca\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565275 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-tls\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565300 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565348 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-secret-metrics-client-certs\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.565435 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/810a9771-08c1-45d8-944c-f9a341d90bec-metrics-server-audit-profiles\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.566166 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-serving-certs-ca-bundle\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") 
" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: E0223 14:37:57.566690 28758 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: E0223 14:37:57.566825 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls podName:212126c7-1db5-456c-add5-d0e3f38fe315 nodeName:}" failed. No retries permitted until 2026-02-23 14:37:58.066795296 +0000 UTC m=+210.193111228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls") pod "telemeter-client-dbf68b6c5-fp955" (UID: "212126c7-1db5-456c-add5-d0e3f38fe315") : secret "telemeter-client-tls" not found Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.567566 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-client-ca-bundle\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.567581 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-secret-telemeter-client\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.567773 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-metrics-client-ca\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.568058 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.568905 master-0 kubenswrapper[28758]: I0223 14:37:57.569384 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.570881 master-0 kubenswrapper[28758]: I0223 14:37:57.569613 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.570881 master-0 kubenswrapper[28758]: I0223 14:37:57.569930 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-trusted-ca-bundle\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: 
\"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.570881 master-0 kubenswrapper[28758]: I0223 14:37:57.569952 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-grpc-tls\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.570881 master-0 kubenswrapper[28758]: I0223 14:37:57.570785 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.571115 master-0 kubenswrapper[28758]: I0223 14:37:57.571081 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-tls\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.571906 master-0 kubenswrapper[28758]: I0223 14:37:57.571875 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/29e46f35-a59a-4c26-82dd-d7573bdcb564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.572941 master-0 kubenswrapper[28758]: I0223 14:37:57.572901 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-federate-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.581940 master-0 kubenswrapper[28758]: I0223 14:37:57.581813 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqcj\" (UniqueName: \"kubernetes.io/projected/212126c7-1db5-456c-add5-d0e3f38fe315-kube-api-access-4vqcj\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:57.582098 master-0 kubenswrapper[28758]: I0223 14:37:57.582055 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-secret-metrics-client-certs\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.582384 master-0 kubenswrapper[28758]: I0223 14:37:57.582318 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/810a9771-08c1-45d8-944c-f9a341d90bec-secret-metrics-server-tls\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.584749 master-0 kubenswrapper[28758]: I0223 14:37:57.584702 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpdf8\" (UniqueName: \"kubernetes.io/projected/810a9771-08c1-45d8-944c-f9a341d90bec-kube-api-access-xpdf8\") pod \"metrics-server-56556ccb8b-kfqz7\" (UID: \"810a9771-08c1-45d8-944c-f9a341d90bec\") " 
pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.588656 master-0 kubenswrapper[28758]: I0223 14:37:57.588622 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dd78\" (UniqueName: \"kubernetes.io/projected/29e46f35-a59a-4c26-82dd-d7573bdcb564-kube-api-access-8dd78\") pod \"thanos-querier-7bb4c94777-hhjw5\" (UID: \"29e46f35-a59a-4c26-82dd-d7573bdcb564\") " pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:57.656031 master-0 kubenswrapper[28758]: I0223 14:37:57.655973 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sxgv7" Feb 23 14:37:57.772294 master-0 kubenswrapper[28758]: I0223 14:37:57.772222 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:37:57.772971 master-0 kubenswrapper[28758]: I0223 14:37:57.772910 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:37:58.076496 master-0 kubenswrapper[28758]: I0223 14:37:58.076425 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:58.076703 master-0 kubenswrapper[28758]: E0223 14:37:58.076673 28758 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Feb 23 14:37:58.076766 master-0 kubenswrapper[28758]: E0223 14:37:58.076746 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls podName:212126c7-1db5-456c-add5-d0e3f38fe315 nodeName:}" failed. No retries permitted until 2026-02-23 14:37:59.076725878 +0000 UTC m=+211.203041820 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls") pod "telemeter-client-dbf68b6c5-fp955" (UID: "212126c7-1db5-456c-add5-d0e3f38fe315") : secret "telemeter-client-tls" not found Feb 23 14:37:58.217956 master-0 kubenswrapper[28758]: I0223 14:37:58.217903 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56556ccb8b-kfqz7"] Feb 23 14:37:58.222619 master-0 kubenswrapper[28758]: W0223 14:37:58.222576 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810a9771_08c1_45d8_944c_f9a341d90bec.slice/crio-ca7e5222e9dcf68f9676bf33963aa1f4285a0613cf4e2dff469dc8008a9a0e24 WatchSource:0}: Error finding container ca7e5222e9dcf68f9676bf33963aa1f4285a0613cf4e2dff469dc8008a9a0e24: Status 404 returned error can't find the container with id ca7e5222e9dcf68f9676bf33963aa1f4285a0613cf4e2dff469dc8008a9a0e24 Feb 23 14:37:58.347327 master-0 kubenswrapper[28758]: I0223 14:37:58.347130 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7bb4c94777-hhjw5"] Feb 23 14:37:58.356347 master-0 kubenswrapper[28758]: I0223 14:37:58.356269 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" event={"ID":"810a9771-08c1-45d8-944c-f9a341d90bec","Type":"ContainerStarted","Data":"ca7e5222e9dcf68f9676bf33963aa1f4285a0613cf4e2dff469dc8008a9a0e24"} Feb 23 14:37:58.358623 master-0 kubenswrapper[28758]: I0223 14:37:58.358555 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxgv7" event={"ID":"0325482a-66bb-476c-93ac-01b718836a37","Type":"ContainerStarted","Data":"3d50e78d1879d5076cb93051677bc5071f530bde49b918ce770907d1e3922b84"} Feb 23 14:37:58.365809 master-0 kubenswrapper[28758]: W0223 14:37:58.362615 28758 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29e46f35_a59a_4c26_82dd_d7573bdcb564.slice/crio-5ef40643a5ffd42fc53745898d7929c95fb9f1477fe439c1c55dccb78a97f5ae WatchSource:0}: Error finding container 5ef40643a5ffd42fc53745898d7929c95fb9f1477fe439c1c55dccb78a97f5ae: Status 404 returned error can't find the container with id 5ef40643a5ffd42fc53745898d7929c95fb9f1477fe439c1c55dccb78a97f5ae Feb 23 14:37:59.111018 master-0 kubenswrapper[28758]: I0223 14:37:59.110943 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:37:59.111374 master-0 kubenswrapper[28758]: E0223 14:37:59.111196 28758 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Feb 23 14:37:59.111374 master-0 kubenswrapper[28758]: E0223 14:37:59.111284 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls podName:212126c7-1db5-456c-add5-d0e3f38fe315 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:01.111264752 +0000 UTC m=+213.237580684 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls") pod "telemeter-client-dbf68b6c5-fp955" (UID: "212126c7-1db5-456c-add5-d0e3f38fe315") : secret "telemeter-client-tls" not found Feb 23 14:37:59.368139 master-0 kubenswrapper[28758]: I0223 14:37:59.367988 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" event={"ID":"810a9771-08c1-45d8-944c-f9a341d90bec","Type":"ContainerStarted","Data":"ece0fe7c926a59d7fe98108d7afa1818701502c654996a46b75abe2cc4f693fa"} Feb 23 14:37:59.369180 master-0 kubenswrapper[28758]: I0223 14:37:59.369135 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" event={"ID":"29e46f35-a59a-4c26-82dd-d7573bdcb564","Type":"ContainerStarted","Data":"5ef40643a5ffd42fc53745898d7929c95fb9f1477fe439c1c55dccb78a97f5ae"} Feb 23 14:37:59.393494 master-0 kubenswrapper[28758]: I0223 14:37:59.393391 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" podStartSLOduration=2.393364477 podStartE2EDuration="2.393364477s" podCreationTimestamp="2026-02-23 14:37:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:37:59.387235605 +0000 UTC m=+211.513551547" watchObservedRunningTime="2026-02-23 14:37:59.393364477 +0000 UTC m=+211.519680409" Feb 23 14:38:00.376192 master-0 kubenswrapper[28758]: I0223 14:38:00.376125 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sxgv7" event={"ID":"0325482a-66bb-476c-93ac-01b718836a37","Type":"ContainerStarted","Data":"12d9683968eb833586323efe2ffbba03e8baeec7cbec700d10f761ba06bb903d"} Feb 23 14:38:00.400597 master-0 kubenswrapper[28758]: I0223 14:38:00.400496 28758 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sxgv7" podStartSLOduration=1.606781726 podStartE2EDuration="3.40045915s" podCreationTimestamp="2026-02-23 14:37:57 +0000 UTC" firstStartedPulling="2026-02-23 14:37:57.677158745 +0000 UTC m=+209.803474677" lastFinishedPulling="2026-02-23 14:37:59.470836169 +0000 UTC m=+211.597152101" observedRunningTime="2026-02-23 14:38:00.39967658 +0000 UTC m=+212.525992522" watchObservedRunningTime="2026-02-23 14:38:00.40045915 +0000 UTC m=+212.526775072" Feb 23 14:38:01.153729 master-0 kubenswrapper[28758]: I0223 14:38:01.153641 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:38:01.154173 master-0 kubenswrapper[28758]: E0223 14:38:01.153857 28758 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Feb 23 14:38:01.154173 master-0 kubenswrapper[28758]: E0223 14:38:01.153958 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls podName:212126c7-1db5-456c-add5-d0e3f38fe315 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:05.153934849 +0000 UTC m=+217.280250771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls") pod "telemeter-client-dbf68b6c5-fp955" (UID: "212126c7-1db5-456c-add5-d0e3f38fe315") : secret "telemeter-client-tls" not found Feb 23 14:38:02.396176 master-0 kubenswrapper[28758]: I0223 14:38:02.396095 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" event={"ID":"29e46f35-a59a-4c26-82dd-d7573bdcb564","Type":"ContainerStarted","Data":"945b7c4243d0c47fa75a7ec4edc1e5829ea91b5e255e83ffee5db43274d81d6a"} Feb 23 14:38:02.396176 master-0 kubenswrapper[28758]: I0223 14:38:02.396161 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" event={"ID":"29e46f35-a59a-4c26-82dd-d7573bdcb564","Type":"ContainerStarted","Data":"1c27cff01f3c50288575cdb0a0703b02ba42704de13f9409ccf4a8348cbb196b"} Feb 23 14:38:02.396176 master-0 kubenswrapper[28758]: I0223 14:38:02.396177 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" event={"ID":"29e46f35-a59a-4c26-82dd-d7573bdcb564","Type":"ContainerStarted","Data":"cec3e6bb79e0f20844ebaea54c3dc7dc1642d7c731c3d3725bd4d3304db06559"} Feb 23 14:38:03.409188 master-0 kubenswrapper[28758]: I0223 14:38:03.408621 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" event={"ID":"29e46f35-a59a-4c26-82dd-d7573bdcb564","Type":"ContainerStarted","Data":"89495ecf2d0dceda02469bc672c9501a1da0fc518b4756795d2a5daeb9a8176e"} Feb 23 14:38:03.409188 master-0 kubenswrapper[28758]: I0223 14:38:03.408718 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" 
event={"ID":"29e46f35-a59a-4c26-82dd-d7573bdcb564","Type":"ContainerStarted","Data":"36933e67a4875dfdfe7bb17ed4adf9c360084063208ebb8e4a32a70b4b0d82f1"} Feb 23 14:38:03.409188 master-0 kubenswrapper[28758]: I0223 14:38:03.408733 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" event={"ID":"29e46f35-a59a-4c26-82dd-d7573bdcb564","Type":"ContainerStarted","Data":"4559525aafd0bb3c4227292b720decf4ee82b9df11bbc41b24cda7e8fc803c1a"} Feb 23 14:38:03.409188 master-0 kubenswrapper[28758]: I0223 14:38:03.408864 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:38:05.223724 master-0 kubenswrapper[28758]: I0223 14:38:05.223637 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:38:05.224888 master-0 kubenswrapper[28758]: E0223 14:38:05.224799 28758 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Feb 23 14:38:05.225009 master-0 kubenswrapper[28758]: E0223 14:38:05.224948 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls podName:212126c7-1db5-456c-add5-d0e3f38fe315 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:13.224920316 +0000 UTC m=+225.351236278 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls") pod "telemeter-client-dbf68b6c5-fp955" (UID: "212126c7-1db5-456c-add5-d0e3f38fe315") : secret "telemeter-client-tls" not found Feb 23 14:38:07.780962 master-0 kubenswrapper[28758]: I0223 14:38:07.780898 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" Feb 23 14:38:07.809462 master-0 kubenswrapper[28758]: I0223 14:38:07.809370 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7bb4c94777-hhjw5" podStartSLOduration=6.457467124 podStartE2EDuration="10.809347292s" podCreationTimestamp="2026-02-23 14:37:57 +0000 UTC" firstStartedPulling="2026-02-23 14:37:58.366816652 +0000 UTC m=+210.493132584" lastFinishedPulling="2026-02-23 14:38:02.7186968 +0000 UTC m=+214.845012752" observedRunningTime="2026-02-23 14:38:03.430937302 +0000 UTC m=+215.557253254" watchObservedRunningTime="2026-02-23 14:38:07.809347292 +0000 UTC m=+219.935663224" Feb 23 14:38:13.274462 master-0 kubenswrapper[28758]: I0223 14:38:13.274383 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" Feb 23 14:38:13.275139 master-0 kubenswrapper[28758]: E0223 14:38:13.274644 28758 secret.go:189] Couldn't get secret openshift-monitoring/telemeter-client-tls: secret "telemeter-client-tls" not found Feb 23 14:38:13.275139 master-0 kubenswrapper[28758]: E0223 14:38:13.274752 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls 
podName:212126c7-1db5-456c-add5-d0e3f38fe315 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:29.274728811 +0000 UTC m=+241.401044743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "telemeter-client-tls" (UniqueName: "kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls") pod "telemeter-client-dbf68b6c5-fp955" (UID: "212126c7-1db5-456c-add5-d0e3f38fe315") : secret "telemeter-client-tls" not found Feb 23 14:38:17.773698 master-0 kubenswrapper[28758]: I0223 14:38:17.773587 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:38:17.773698 master-0 kubenswrapper[28758]: I0223 14:38:17.773676 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:38:19.121334 master-0 kubenswrapper[28758]: I0223 14:38:19.119802 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 23 14:38:19.131336 master-0 kubenswrapper[28758]: I0223 14:38:19.131267 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.134693 master-0 kubenswrapper[28758]: I0223 14:38:19.132837 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Feb 23 14:38:19.134693 master-0 kubenswrapper[28758]: I0223 14:38:19.133288 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Feb 23 14:38:19.134693 master-0 kubenswrapper[28758]: I0223 14:38:19.133417 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Feb 23 14:38:19.134693 master-0 kubenswrapper[28758]: I0223 14:38:19.133561 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Feb 23 14:38:19.136465 master-0 kubenswrapper[28758]: I0223 14:38:19.136413 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Feb 23 14:38:19.139912 master-0 kubenswrapper[28758]: I0223 14:38:19.139852 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Feb 23 14:38:19.142740 master-0 kubenswrapper[28758]: I0223 14:38:19.142693 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 23 14:38:19.143088 master-0 kubenswrapper[28758]: I0223 14:38:19.143057 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Feb 23 14:38:19.144834 master-0 kubenswrapper[28758]: I0223 14:38:19.144798 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Feb 23 14:38:19.273718 master-0 kubenswrapper[28758]: I0223 14:38:19.273667 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-config-volume\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274005 master-0 kubenswrapper[28758]: I0223 14:38:19.273987 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f33aa5f-106b-4743-9d67-758977e09c33-config-out\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274097 master-0 kubenswrapper[28758]: I0223 14:38:19.274082 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274214 master-0 kubenswrapper[28758]: I0223 14:38:19.274195 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1f33aa5f-106b-4743-9d67-758977e09c33-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274323 master-0 kubenswrapper[28758]: I0223 14:38:19.274306 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqcph\" (UniqueName: \"kubernetes.io/projected/1f33aa5f-106b-4743-9d67-758977e09c33-kube-api-access-lqcph\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " 
pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274466 master-0 kubenswrapper[28758]: I0223 14:38:19.274444 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f33aa5f-106b-4743-9d67-758977e09c33-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274586 master-0 kubenswrapper[28758]: I0223 14:38:19.274572 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274670 master-0 kubenswrapper[28758]: I0223 14:38:19.274658 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-web-config\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274758 master-0 kubenswrapper[28758]: I0223 14:38:19.274746 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274841 master-0 kubenswrapper[28758]: I0223 14:38:19.274829 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.274942 master-0 kubenswrapper[28758]: I0223 14:38:19.274930 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f33aa5f-106b-4743-9d67-758977e09c33-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.275039 master-0 kubenswrapper[28758]: I0223 14:38:19.275026 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f33aa5f-106b-4743-9d67-758977e09c33-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.376747 master-0 kubenswrapper[28758]: I0223 14:38:19.376615 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-config-volume\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.376747 master-0 kubenswrapper[28758]: I0223 14:38:19.376703 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f33aa5f-106b-4743-9d67-758977e09c33-config-out\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.376747 master-0 kubenswrapper[28758]: I0223 14:38:19.376728 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377048 master-0 kubenswrapper[28758]: I0223 14:38:19.376760 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1f33aa5f-106b-4743-9d67-758977e09c33-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377048 master-0 kubenswrapper[28758]: I0223 14:38:19.376786 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqcph\" (UniqueName: \"kubernetes.io/projected/1f33aa5f-106b-4743-9d67-758977e09c33-kube-api-access-lqcph\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377048 master-0 kubenswrapper[28758]: I0223 14:38:19.376813 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f33aa5f-106b-4743-9d67-758977e09c33-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377048 master-0 kubenswrapper[28758]: I0223 14:38:19.376838 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " 
pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377206 master-0 kubenswrapper[28758]: I0223 14:38:19.377044 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-web-config\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377206 master-0 kubenswrapper[28758]: I0223 14:38:19.377132 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377206 master-0 kubenswrapper[28758]: I0223 14:38:19.377181 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377385 master-0 kubenswrapper[28758]: I0223 14:38:19.377272 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f33aa5f-106b-4743-9d67-758977e09c33-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377385 master-0 kubenswrapper[28758]: I0223 14:38:19.377353 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f33aa5f-106b-4743-9d67-758977e09c33-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.377571 master-0 kubenswrapper[28758]: E0223 14:38:19.377529 28758 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Feb 23 14:38:19.377637 master-0 kubenswrapper[28758]: E0223 14:38:19.377628 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls podName:1f33aa5f-106b-4743-9d67-758977e09c33 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:19.877601236 +0000 UTC m=+232.003917348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1f33aa5f-106b-4743-9d67-758977e09c33") : secret "alertmanager-main-tls" not found Feb 23 14:38:19.377875 master-0 kubenswrapper[28758]: I0223 14:38:19.377829 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1f33aa5f-106b-4743-9d67-758977e09c33-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.378613 master-0 kubenswrapper[28758]: I0223 14:38:19.378591 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f33aa5f-106b-4743-9d67-758977e09c33-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.378775 master-0 kubenswrapper[28758]: I0223 14:38:19.378734 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/1f33aa5f-106b-4743-9d67-758977e09c33-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.380670 master-0 kubenswrapper[28758]: I0223 14:38:19.380605 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-web-config\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.381220 master-0 kubenswrapper[28758]: I0223 14:38:19.381191 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.381905 master-0 kubenswrapper[28758]: I0223 14:38:19.381723 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1f33aa5f-106b-4743-9d67-758977e09c33-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.381905 master-0 kubenswrapper[28758]: I0223 14:38:19.381853 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.382624 master-0 kubenswrapper[28758]: I0223 14:38:19.382598 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1f33aa5f-106b-4743-9d67-758977e09c33-config-out\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.382702 master-0 kubenswrapper[28758]: I0223 14:38:19.382634 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-config-volume\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.384888 master-0 kubenswrapper[28758]: I0223 14:38:19.384822 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.396510 master-0 kubenswrapper[28758]: I0223 14:38:19.396452 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqcph\" (UniqueName: \"kubernetes.io/projected/1f33aa5f-106b-4743-9d67-758977e09c33-kube-api-access-lqcph\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.886193 master-0 kubenswrapper[28758]: I0223 14:38:19.886119 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:19.887226 master-0 kubenswrapper[28758]: E0223 14:38:19.887168 28758 secret.go:189] Couldn't get 
secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Feb 23 14:38:19.887320 master-0 kubenswrapper[28758]: E0223 14:38:19.887277 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls podName:1f33aa5f-106b-4743-9d67-758977e09c33 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:20.887252288 +0000 UTC m=+233.013568230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1f33aa5f-106b-4743-9d67-758977e09c33") : secret "alertmanager-main-tls" not found Feb 23 14:38:20.903192 master-0 kubenswrapper[28758]: I0223 14:38:20.903123 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0" Feb 23 14:38:20.903828 master-0 kubenswrapper[28758]: E0223 14:38:20.903419 28758 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Feb 23 14:38:20.903828 master-0 kubenswrapper[28758]: E0223 14:38:20.903609 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls podName:1f33aa5f-106b-4743-9d67-758977e09c33 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:22.903575803 +0000 UTC m=+235.029891735 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1f33aa5f-106b-4743-9d67-758977e09c33") : secret "alertmanager-main-tls" not found Feb 23 14:38:21.132206 master-0 kubenswrapper[28758]: I0223 14:38:21.132145 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Feb 23 14:38:21.134329 master-0 kubenswrapper[28758]: I0223 14:38:21.134292 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.136357 master-0 kubenswrapper[28758]: I0223 14:38:21.136308 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Feb 23 14:38:21.136668 master-0 kubenswrapper[28758]: I0223 14:38:21.136636 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Feb 23 14:38:21.137883 master-0 kubenswrapper[28758]: I0223 14:38:21.137862 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-354e3h36mtupn" Feb 23 14:38:21.137973 master-0 kubenswrapper[28758]: I0223 14:38:21.137891 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Feb 23 14:38:21.138119 master-0 kubenswrapper[28758]: I0223 14:38:21.138099 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Feb 23 14:38:21.138273 master-0 kubenswrapper[28758]: I0223 14:38:21.138202 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Feb 23 14:38:21.138342 master-0 kubenswrapper[28758]: I0223 14:38:21.138117 28758 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Feb 23 14:38:21.138342 master-0 kubenswrapper[28758]: I0223 14:38:21.138336 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Feb 23 14:38:21.138429 master-0 kubenswrapper[28758]: I0223 14:38:21.138379 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Feb 23 14:38:21.138429 master-0 kubenswrapper[28758]: I0223 14:38:21.138412 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Feb 23 14:38:21.139817 master-0 kubenswrapper[28758]: I0223 14:38:21.139786 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Feb 23 14:38:21.143735 master-0 kubenswrapper[28758]: I0223 14:38:21.143427 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Feb 23 14:38:21.155670 master-0 kubenswrapper[28758]: I0223 14:38:21.155547 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Feb 23 14:38:21.311095 master-0 kubenswrapper[28758]: I0223 14:38:21.311031 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c97be973-6eff-47c3-970d-d711ffc2750b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.311310 master-0 kubenswrapper[28758]: I0223 14:38:21.311100 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.311310 master-0 kubenswrapper[28758]: I0223 14:38:21.311173 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.311310 master-0 kubenswrapper[28758]: I0223 14:38:21.311204 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb42l\" (UniqueName: \"kubernetes.io/projected/c97be973-6eff-47c3-970d-d711ffc2750b-kube-api-access-kb42l\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.311310 master-0 kubenswrapper[28758]: I0223 14:38:21.311238 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312140 master-0 kubenswrapper[28758]: I0223 14:38:21.312096 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312198 master-0 kubenswrapper[28758]: I0223 14:38:21.312151 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312246 master-0 kubenswrapper[28758]: I0223 14:38:21.312227 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312281 master-0 kubenswrapper[28758]: I0223 14:38:21.312261 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312316 master-0 kubenswrapper[28758]: I0223 14:38:21.312285 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c97be973-6eff-47c3-970d-d711ffc2750b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312346 master-0 kubenswrapper[28758]: I0223 14:38:21.312337 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312383 master-0 
kubenswrapper[28758]: I0223 14:38:21.312358 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312415 master-0 kubenswrapper[28758]: I0223 14:38:21.312382 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312448 master-0 kubenswrapper[28758]: I0223 14:38:21.312413 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312448 master-0 kubenswrapper[28758]: I0223 14:38:21.312434 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312529 master-0 kubenswrapper[28758]: I0223 14:38:21.312461 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312529 master-0 kubenswrapper[28758]: I0223 14:38:21.312497 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.312668 master-0 kubenswrapper[28758]: I0223 14:38:21.312535 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-config\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413261 master-0 kubenswrapper[28758]: I0223 14:38:21.413136 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413261 master-0 kubenswrapper[28758]: I0223 14:38:21.413188 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413261 master-0 kubenswrapper[28758]: I0223 14:38:21.413210 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413261 master-0 kubenswrapper[28758]: I0223 14:38:21.413228 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413261 master-0 kubenswrapper[28758]: I0223 14:38:21.413245 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413261 master-0 kubenswrapper[28758]: I0223 14:38:21.413271 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-config\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413301 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c97be973-6eff-47c3-970d-d711ffc2750b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413320 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413348 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413364 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb42l\" (UniqueName: \"kubernetes.io/projected/c97be973-6eff-47c3-970d-d711ffc2750b-kube-api-access-kb42l\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413386 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413414 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413430 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413460 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413493 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413511 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c97be973-6eff-47c3-970d-d711ffc2750b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413538 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.413617 master-0 kubenswrapper[28758]: I0223 14:38:21.413553 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.413966 master-0 kubenswrapper[28758]: I0223 14:38:21.413931 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.414189 master-0 kubenswrapper[28758]: I0223 14:38:21.414146 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.414289 master-0 kubenswrapper[28758]: E0223 14:38:21.414263 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found
Feb 23 14:38:21.414337 master-0 kubenswrapper[28758]: E0223 14:38:21.414314 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls podName:c97be973-6eff-47c3-970d-d711ffc2750b nodeName:}" failed. No retries permitted until 2026-02-23 14:38:21.914298704 +0000 UTC m=+234.040614636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c97be973-6eff-47c3-970d-d711ffc2750b") : secret "prometheus-k8s-tls" not found
Feb 23 14:38:21.415715 master-0 kubenswrapper[28758]: E0223 14:38:21.414629 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found
Feb 23 14:38:21.415715 master-0 kubenswrapper[28758]: E0223 14:38:21.414688 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls podName:c97be973-6eff-47c3-970d-d711ffc2750b nodeName:}" failed. No retries permitted until 2026-02-23 14:38:21.914672164 +0000 UTC m=+234.040988096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c97be973-6eff-47c3-970d-d711ffc2750b") : secret "prometheus-k8s-thanos-sidecar-tls" not found
Feb 23 14:38:21.415715 master-0 kubenswrapper[28758]: I0223 14:38:21.415636 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.415884 master-0 kubenswrapper[28758]: I0223 14:38:21.415819 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.416924 master-0 kubenswrapper[28758]: I0223 14:38:21.416895 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.417133 master-0 kubenswrapper[28758]: I0223 14:38:21.417105 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.418327 master-0 kubenswrapper[28758]: I0223 14:38:21.418189 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.419155 master-0 kubenswrapper[28758]: I0223 14:38:21.419109 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-config\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.419718 master-0 kubenswrapper[28758]: I0223 14:38:21.419683 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.419842 master-0 kubenswrapper[28758]: I0223 14:38:21.419815 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c97be973-6eff-47c3-970d-d711ffc2750b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.419946 master-0 kubenswrapper[28758]: I0223 14:38:21.419897 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.420322 master-0 kubenswrapper[28758]: I0223 14:38:21.420284 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.420386 master-0 kubenswrapper[28758]: I0223 14:38:21.420350 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.424592 master-0 kubenswrapper[28758]: I0223 14:38:21.424546 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c97be973-6eff-47c3-970d-d711ffc2750b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.425088 master-0 kubenswrapper[28758]: I0223 14:38:21.425053 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c97be973-6eff-47c3-970d-d711ffc2750b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.436786 master-0 kubenswrapper[28758]: I0223 14:38:21.436732 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb42l\" (UniqueName: \"kubernetes.io/projected/c97be973-6eff-47c3-970d-d711ffc2750b-kube-api-access-kb42l\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.922430 master-0 kubenswrapper[28758]: I0223 14:38:21.922255 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.923336 master-0 kubenswrapper[28758]: E0223 14:38:21.922790 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found
Feb 23 14:38:21.923336 master-0 kubenswrapper[28758]: I0223 14:38:21.922841 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:21.923336 master-0 kubenswrapper[28758]: E0223 14:38:21.922934 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls podName:c97be973-6eff-47c3-970d-d711ffc2750b nodeName:}" failed. No retries permitted until 2026-02-23 14:38:22.922902389 +0000 UTC m=+235.049218351 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c97be973-6eff-47c3-970d-d711ffc2750b") : secret "prometheus-k8s-thanos-sidecar-tls" not found
Feb 23 14:38:21.923336 master-0 kubenswrapper[28758]: E0223 14:38:21.923108 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found
Feb 23 14:38:21.923336 master-0 kubenswrapper[28758]: E0223 14:38:21.923209 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls podName:c97be973-6eff-47c3-970d-d711ffc2750b nodeName:}" failed. No retries permitted until 2026-02-23 14:38:22.923183366 +0000 UTC m=+235.049499308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c97be973-6eff-47c3-970d-d711ffc2750b") : secret "prometheus-k8s-tls" not found
Feb 23 14:38:22.403795 master-0 kubenswrapper[28758]: I0223 14:38:22.403702 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-556bbd75bc-q4jfx" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" containerID="cri-o://4b9305d2d2ffc30dbee7c9a127f79d70cae04ade1e03a8ca762fc497eed55098" gracePeriod=15
Feb 23 14:38:22.555551 master-0 kubenswrapper[28758]: I0223 14:38:22.555471 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556bbd75bc-q4jfx_0119f38b-9247-4e4c-af16-31202765777a/console/0.log"
Feb 23 14:38:22.555551 master-0 kubenswrapper[28758]: I0223 14:38:22.555537 28758 generic.go:334] "Generic (PLEG): container finished" podID="0119f38b-9247-4e4c-af16-31202765777a" containerID="4b9305d2d2ffc30dbee7c9a127f79d70cae04ade1e03a8ca762fc497eed55098" exitCode=2
Feb 23 14:38:22.555789 master-0 kubenswrapper[28758]: I0223 14:38:22.555568 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556bbd75bc-q4jfx" event={"ID":"0119f38b-9247-4e4c-af16-31202765777a","Type":"ContainerDied","Data":"4b9305d2d2ffc30dbee7c9a127f79d70cae04ade1e03a8ca762fc497eed55098"}
Feb 23 14:38:22.873644 master-0 kubenswrapper[28758]: I0223 14:38:22.873589 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556bbd75bc-q4jfx_0119f38b-9247-4e4c-af16-31202765777a/console/0.log"
Feb 23 14:38:22.873829 master-0 kubenswrapper[28758]: I0223 14:38:22.873658 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:38:22.939108 master-0 kubenswrapper[28758]: I0223 14:38:22.939046 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:22.939903 master-0 kubenswrapper[28758]: I0223 14:38:22.939353 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:22.939903 master-0 kubenswrapper[28758]: I0223 14:38:22.939382 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0"
Feb 23 14:38:22.939903 master-0 kubenswrapper[28758]: E0223 14:38:22.939541 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-thanos-sidecar-tls: secret "prometheus-k8s-thanos-sidecar-tls" not found
Feb 23 14:38:22.939903 master-0 kubenswrapper[28758]: E0223 14:38:22.939630 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls podName:c97be973-6eff-47c3-970d-d711ffc2750b nodeName:}" failed. No retries permitted until 2026-02-23 14:38:24.939611075 +0000 UTC m=+237.065927007 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-thanos-sidecar-tls" (UniqueName: "kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls") pod "prometheus-k8s-0" (UID: "c97be973-6eff-47c3-970d-d711ffc2750b") : secret "prometheus-k8s-thanos-sidecar-tls" not found
Feb 23 14:38:22.940353 master-0 kubenswrapper[28758]: E0223 14:38:22.940319 28758 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Feb 23 14:38:22.940399 master-0 kubenswrapper[28758]: E0223 14:38:22.940363 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls podName:1f33aa5f-106b-4743-9d67-758977e09c33 nodeName:}" failed. No retries permitted until 2026-02-23 14:38:26.940353574 +0000 UTC m=+239.066669506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1f33aa5f-106b-4743-9d67-758977e09c33") : secret "alertmanager-main-tls" not found
Feb 23 14:38:22.940435 master-0 kubenswrapper[28758]: E0223 14:38:22.940415 28758 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-k8s-tls: secret "prometheus-k8s-tls" not found
Feb 23 14:38:22.940468 master-0 kubenswrapper[28758]: E0223 14:38:22.940442 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls podName:c97be973-6eff-47c3-970d-d711ffc2750b nodeName:}" failed. No retries permitted until 2026-02-23 14:38:24.940434527 +0000 UTC m=+237.066750459 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-prometheus-k8s-tls" (UniqueName: "kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls") pod "prometheus-k8s-0" (UID: "c97be973-6eff-47c3-970d-d711ffc2750b") : secret "prometheus-k8s-tls" not found
Feb 23 14:38:23.041463 master-0 kubenswrapper[28758]: I0223 14:38:23.041411 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-oauth-serving-cert\") pod \"0119f38b-9247-4e4c-af16-31202765777a\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") "
Feb 23 14:38:23.041708 master-0 kubenswrapper[28758]: I0223 14:38:23.041642 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-trusted-ca-bundle\") pod \"0119f38b-9247-4e4c-af16-31202765777a\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") "
Feb 23 14:38:23.041708 master-0 kubenswrapper[28758]: I0223 14:38:23.041685 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-serving-cert\") pod \"0119f38b-9247-4e4c-af16-31202765777a\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") "
Feb 23 14:38:23.041708 master-0 kubenswrapper[28758]: I0223 14:38:23.041700 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-console-config\") pod \"0119f38b-9247-4e4c-af16-31202765777a\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") "
Feb 23 14:38:23.041845 master-0 kubenswrapper[28758]: I0223 14:38:23.041729 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b54fz\" (UniqueName: \"kubernetes.io/projected/0119f38b-9247-4e4c-af16-31202765777a-kube-api-access-b54fz\") pod \"0119f38b-9247-4e4c-af16-31202765777a\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") "
Feb 23 14:38:23.041845 master-0 kubenswrapper[28758]: I0223 14:38:23.041752 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-oauth-config\") pod \"0119f38b-9247-4e4c-af16-31202765777a\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") "
Feb 23 14:38:23.041845 master-0 kubenswrapper[28758]: I0223 14:38:23.041794 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-service-ca\") pod \"0119f38b-9247-4e4c-af16-31202765777a\" (UID: \"0119f38b-9247-4e4c-af16-31202765777a\") "
Feb 23 14:38:23.042079 master-0 kubenswrapper[28758]: I0223 14:38:23.042020 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0119f38b-9247-4e4c-af16-31202765777a" (UID: "0119f38b-9247-4e4c-af16-31202765777a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:38:23.042356 master-0 kubenswrapper[28758]: I0223 14:38:23.042319 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-console-config" (OuterVolumeSpecName: "console-config") pod "0119f38b-9247-4e4c-af16-31202765777a" (UID: "0119f38b-9247-4e4c-af16-31202765777a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:38:23.042428 master-0 kubenswrapper[28758]: I0223 14:38:23.042352 28758 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Feb 23 14:38:23.042492 master-0 kubenswrapper[28758]: I0223 14:38:23.042434 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0119f38b-9247-4e4c-af16-31202765777a" (UID: "0119f38b-9247-4e4c-af16-31202765777a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:38:23.042654 master-0 kubenswrapper[28758]: I0223 14:38:23.042620 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-service-ca" (OuterVolumeSpecName: "service-ca") pod "0119f38b-9247-4e4c-af16-31202765777a" (UID: "0119f38b-9247-4e4c-af16-31202765777a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:38:23.045345 master-0 kubenswrapper[28758]: I0223 14:38:23.045224 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0119f38b-9247-4e4c-af16-31202765777a-kube-api-access-b54fz" (OuterVolumeSpecName: "kube-api-access-b54fz") pod "0119f38b-9247-4e4c-af16-31202765777a" (UID: "0119f38b-9247-4e4c-af16-31202765777a"). InnerVolumeSpecName "kube-api-access-b54fz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:38:23.045497 master-0 kubenswrapper[28758]: I0223 14:38:23.045441 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0119f38b-9247-4e4c-af16-31202765777a" (UID: "0119f38b-9247-4e4c-af16-31202765777a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:38:23.048682 master-0 kubenswrapper[28758]: I0223 14:38:23.048638 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0119f38b-9247-4e4c-af16-31202765777a" (UID: "0119f38b-9247-4e4c-af16-31202765777a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:38:23.144173 master-0 kubenswrapper[28758]: I0223 14:38:23.144047 28758 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Feb 23 14:38:23.144173 master-0 kubenswrapper[28758]: I0223 14:38:23.144088 28758 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Feb 23 14:38:23.144173 master-0 kubenswrapper[28758]: I0223 14:38:23.144097 28758 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-console-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:38:23.144173 master-0 kubenswrapper[28758]: I0223 14:38:23.144106 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b54fz\" (UniqueName: \"kubernetes.io/projected/0119f38b-9247-4e4c-af16-31202765777a-kube-api-access-b54fz\") on node \"master-0\" DevicePath \"\""
Feb 23 14:38:23.144173 master-0 kubenswrapper[28758]: I0223 14:38:23.144135 28758 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0119f38b-9247-4e4c-af16-31202765777a-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:38:23.144173 master-0 kubenswrapper[28758]: I0223 14:38:23.144144 28758 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0119f38b-9247-4e4c-af16-31202765777a-service-ca\") on node \"master-0\" DevicePath \"\""
Feb 23 14:38:23.563865 master-0 kubenswrapper[28758]: I0223 14:38:23.563829 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556bbd75bc-q4jfx_0119f38b-9247-4e4c-af16-31202765777a/console/0.log"
Feb 23 14:38:23.564169 master-0 kubenswrapper[28758]: I0223 14:38:23.564145 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556bbd75bc-q4jfx" event={"ID":"0119f38b-9247-4e4c-af16-31202765777a","Type":"ContainerDied","Data":"6539d142a95f7360921c41ed8fc49fd18b776d7d20b04875bb0b029b370f8ffd"}
Feb 23 14:38:23.564302 master-0 kubenswrapper[28758]: I0223 14:38:23.564213 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-556bbd75bc-q4jfx"
Feb 23 14:38:23.564403 master-0 kubenswrapper[28758]: I0223 14:38:23.564274 28758 scope.go:117] "RemoveContainer" containerID="4b9305d2d2ffc30dbee7c9a127f79d70cae04ade1e03a8ca762fc497eed55098"
Feb 23 14:38:23.606004 master-0 kubenswrapper[28758]: I0223 14:38:23.605934 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-556bbd75bc-q4jfx"]
Feb 23 14:38:23.613739 master-0 kubenswrapper[28758]: I0223 14:38:23.613684 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-556bbd75bc-q4jfx"]
Feb 23 14:38:24.098217 master-0 kubenswrapper[28758]: I0223 14:38:24.097662 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0119f38b-9247-4e4c-af16-31202765777a" path="/var/lib/kubelet/pods/0119f38b-9247-4e4c-af16-31202765777a/volumes"
Feb 23 14:38:24.977603 master-0 kubenswrapper[28758]: I0223 14:38:24.977540 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:24.978042 master-0 kubenswrapper[28758]: I0223 14:38:24.977971 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:24.981252 master-0 kubenswrapper[28758]: I0223 14:38:24.981219 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:24.981613 master-0 kubenswrapper[28758]: I0223 14:38:24.981576 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c97be973-6eff-47c3-970d-d711ffc2750b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c97be973-6eff-47c3-970d-d711ffc2750b\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:25.061069 master-0 kubenswrapper[28758]: I0223 14:38:25.061025 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Feb 23 14:38:25.499412 master-0 kubenswrapper[28758]: I0223 14:38:25.499330 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Feb 23 14:38:25.580355 master-0 kubenswrapper[28758]: I0223 14:38:25.580261 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerStarted","Data":"025a9195a9b11ce4fa6fe99a3d54f0ae09649b0be15fbe54f16a56c74bb773fe"}
Feb 23 14:38:27.017571 master-0 kubenswrapper[28758]: I0223 14:38:27.017417 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0"
Feb 23 14:38:27.021254 master-0 kubenswrapper[28758]: I0223 14:38:27.021194 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1f33aa5f-106b-4743-9d67-758977e09c33-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1f33aa5f-106b-4743-9d67-758977e09c33\") " pod="openshift-monitoring/alertmanager-main-0"
Feb 23 14:38:27.255265 master-0 kubenswrapper[28758]: I0223 14:38:27.255178 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Feb 23 14:38:27.601012 master-0 kubenswrapper[28758]: I0223 14:38:27.600862 28758 generic.go:334] "Generic (PLEG): container finished" podID="c97be973-6eff-47c3-970d-d711ffc2750b" containerID="336b05585a9c63d1f1ee0ead8b5c970fd851a6d9c5368483faa75d7d21c88b43" exitCode=0
Feb 23 14:38:27.601012 master-0 kubenswrapper[28758]: I0223 14:38:27.600922 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerDied","Data":"336b05585a9c63d1f1ee0ead8b5c970fd851a6d9c5368483faa75d7d21c88b43"}
Feb 23 14:38:27.683257 master-0 kubenswrapper[28758]: I0223 14:38:27.683196 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Feb 23 14:38:27.687166 master-0 kubenswrapper[28758]: W0223 14:38:27.687110 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f33aa5f_106b_4743_9d67_758977e09c33.slice/crio-dd56a434776bd4586a4f01e17d91d9b55f10f82e537cbd0bb212bf5d3132dcd0 WatchSource:0}: Error finding container dd56a434776bd4586a4f01e17d91d9b55f10f82e537cbd0bb212bf5d3132dcd0: Status 404 returned error can't find the container with id dd56a434776bd4586a4f01e17d91d9b55f10f82e537cbd0bb212bf5d3132dcd0
Feb 23 14:38:28.609911 master-0 kubenswrapper[28758]: I0223 14:38:28.609782 28758 generic.go:334] "Generic (PLEG): container finished" podID="1f33aa5f-106b-4743-9d67-758977e09c33" containerID="302d39fa3fa85b31649e4c522963605980f8c35a17f9827e57ecdb9bc59ef3e3" exitCode=0
Feb 23 14:38:28.609911 master-0 kubenswrapper[28758]: I0223 14:38:28.609821 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerDied","Data":"302d39fa3fa85b31649e4c522963605980f8c35a17f9827e57ecdb9bc59ef3e3"}
Feb 23 14:38:28.609911 master-0 kubenswrapper[28758]: I0223 14:38:28.609866 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerStarted","Data":"dd56a434776bd4586a4f01e17d91d9b55f10f82e537cbd0bb212bf5d3132dcd0"}
Feb 23 14:38:29.359717 master-0 kubenswrapper[28758]: I0223 14:38:29.359627 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955"
Feb 23 14:38:29.363505 master-0 kubenswrapper[28758]: I0223 14:38:29.363441 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/212126c7-1db5-456c-add5-d0e3f38fe315-telemeter-client-tls\") pod \"telemeter-client-dbf68b6c5-fp955\" (UID: \"212126c7-1db5-456c-add5-d0e3f38fe315\") " pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955"
Feb 23 14:38:29.553397 master-0 kubenswrapper[28758]: I0223 14:38:29.553345 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-tscn7"
Feb 23 14:38:29.561956 master-0 kubenswrapper[28758]: I0223 14:38:29.561895 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955"
Feb 23 14:38:31.338174 master-0 kubenswrapper[28758]: I0223 14:38:31.338069 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-dbf68b6c5-fp955"]
Feb 23 14:38:31.646656 master-0 kubenswrapper[28758]: I0223 14:38:31.646609 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerStarted","Data":"b0b690af54efb6ecd96fc5160f9f7ce8df1d5f8cf777cec5d9429d15fbb566d6"}
Feb 23 14:38:31.646656 master-0 kubenswrapper[28758]: I0223 14:38:31.646659 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerStarted","Data":"c7a224c555408d4534d78f01e4540292494f514be8c6acb50a95c80a53de58f7"}
Feb 23 14:38:31.646867 master-0 kubenswrapper[28758]: I0223 14:38:31.646671 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerStarted","Data":"2773886f89f24e294e508ee0a1fa00b8ee75b6b6f2ac9dbcaf040b623052dd6f"}
Feb 23 14:38:31.646867 master-0 kubenswrapper[28758]: I0223 14:38:31.646682 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerStarted","Data":"0116af32796a424f326ac7bbd8d2481a83f7588442064dccdeda0a8b4527aa3a"}
Feb 23 14:38:31.651059 master-0 kubenswrapper[28758]: I0223 14:38:31.651004 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" event={"ID":"212126c7-1db5-456c-add5-d0e3f38fe315","Type":"ContainerStarted","Data":"82b13da129d28800b8ba4b6d92ef850cfe7cdddcd2d3f776f3e4f191ed4d8409"}
Feb 23 14:38:31.655740 master-0 kubenswrapper[28758]: I0223 14:38:31.655696 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerStarted","Data":"835174dc8478c11b1750a11ae002e124ebb6f276c1b3f3136ee1474518a25843"}
Feb 23 14:38:31.655863 master-0 kubenswrapper[28758]: I0223 14:38:31.655743 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerStarted","Data":"8edae7745af95ceec2c6c1c5302bab9753c00c3dc9fc895871e8363a3583c272"}
Feb 23 14:38:31.655863 master-0 kubenswrapper[28758]: I0223 14:38:31.655757 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerStarted","Data":"3f8f393be25a210e084b7daf898a335ce57db96ef2566390be32be107a22080b"}
Feb 23 14:38:31.655863 master-0 kubenswrapper[28758]: I0223 14:38:31.655771 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerStarted","Data":"41ca31f3b587e2d7d0977f059474d5ae4fbf329c9f65bee5540b1f5d9df7b66b"}
Feb 23 14:38:32.668662 master-0 kubenswrapper[28758]: I0223 14:38:32.668606 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerStarted","Data":"660399da00c0b1c1a470eab911016afe8eff526d7b8db94364b9e37352ce7cb6"}
Feb 23 14:38:32.668662 master-0 kubenswrapper[28758]: I0223 14:38:32.668665 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1f33aa5f-106b-4743-9d67-758977e09c33","Type":"ContainerStarted","Data":"9a61c8f48881dc5c167241e1e0d380bc1dc8360d2a5235df90a0b4923951b635"}
Feb 23 14:38:32.674432 master-0 kubenswrapper[28758]: I0223 14:38:32.674409 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerStarted","Data":"a6603246a47ce78487493edd6fcf8ffd1e2c99fd75f68e3671f18332440da7e4"}
Feb 23 14:38:32.674549 master-0 kubenswrapper[28758]: I0223 14:38:32.674535 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c97be973-6eff-47c3-970d-d711ffc2750b","Type":"ContainerStarted","Data":"c32c7073b1c9a5a33c4374a6e86883cfb9211540d22a9db8087a20a96d8eb71f"}
Feb 23 14:38:32.713642 master-0 kubenswrapper[28758]: I0223 14:38:32.713255 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=11.413177213 podStartE2EDuration="13.713232996s" podCreationTimestamp="2026-02-23 14:38:19 +0000 UTC" firstStartedPulling="2026-02-23 14:38:28.611653424 +0000 UTC m=+240.737969356" lastFinishedPulling="2026-02-23 14:38:30.911709207 +0000 UTC m=+243.038025139" observedRunningTime="2026-02-23 14:38:32.712186328 +0000 UTC m=+244.838502290" watchObservedRunningTime="2026-02-23 14:38:32.713232996 +0000 UTC m=+244.839548928"
Feb 23 14:38:33.685886 master-0 kubenswrapper[28758]: I0223 14:38:33.685786 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" event={"ID":"212126c7-1db5-456c-add5-d0e3f38fe315","Type":"ContainerStarted","Data":"c4022a5ca52d7e2cd2ffe889a2ecf91d857ee71172ef646b44717c07ab266825"}
Feb 23 14:38:33.685886 master-0 kubenswrapper[28758]: I0223 14:38:33.685878 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" event={"ID":"212126c7-1db5-456c-add5-d0e3f38fe315","Type":"ContainerStarted","Data":"0d7a96c8894d96cfac6f63fa19e52340b98edba16abe1356ee95353dbc6ec75c"}
Feb 23 14:38:34.696498 master-0 kubenswrapper[28758]: I0223 14:38:34.696415 
28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" event={"ID":"212126c7-1db5-456c-add5-d0e3f38fe315","Type":"ContainerStarted","Data":"eebd44ebe68642dac135cc46255414798e1b3b5f4054adc4595be24e9b834c99"} Feb 23 14:38:34.724014 master-0 kubenswrapper[28758]: I0223 14:38:34.723919 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=8.320348431 podStartE2EDuration="13.723891919s" podCreationTimestamp="2026-02-23 14:38:21 +0000 UTC" firstStartedPulling="2026-02-23 14:38:25.508671183 +0000 UTC m=+237.634987115" lastFinishedPulling="2026-02-23 14:38:30.912214671 +0000 UTC m=+243.038530603" observedRunningTime="2026-02-23 14:38:32.753322979 +0000 UTC m=+244.879638911" watchObservedRunningTime="2026-02-23 14:38:34.723891919 +0000 UTC m=+246.850207851" Feb 23 14:38:34.727925 master-0 kubenswrapper[28758]: I0223 14:38:34.727862 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-dbf68b6c5-fp955" podStartSLOduration=35.804346422 podStartE2EDuration="37.727845964s" podCreationTimestamp="2026-02-23 14:37:57 +0000 UTC" firstStartedPulling="2026-02-23 14:38:31.35632571 +0000 UTC m=+243.482641642" lastFinishedPulling="2026-02-23 14:38:33.279825242 +0000 UTC m=+245.406141184" observedRunningTime="2026-02-23 14:38:34.721222377 +0000 UTC m=+246.847538309" watchObservedRunningTime="2026-02-23 14:38:34.727845964 +0000 UTC m=+246.854161896" Feb 23 14:38:35.061945 master-0 kubenswrapper[28758]: I0223 14:38:35.061399 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:38:35.431431 master-0 kubenswrapper[28758]: I0223 14:38:35.431360 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-fdcd8d88c-hgj8l"] Feb 23 14:38:35.431918 master-0 kubenswrapper[28758]: E0223 14:38:35.431852 
28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" Feb 23 14:38:35.431918 master-0 kubenswrapper[28758]: I0223 14:38:35.431886 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" Feb 23 14:38:35.432297 master-0 kubenswrapper[28758]: I0223 14:38:35.432226 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0119f38b-9247-4e4c-af16-31202765777a" containerName="console" Feb 23 14:38:35.433005 master-0 kubenswrapper[28758]: I0223 14:38:35.432968 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.465959 master-0 kubenswrapper[28758]: I0223 14:38:35.465904 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-oauth-config\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.466173 master-0 kubenswrapper[28758]: I0223 14:38:35.466064 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-serving-cert\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.466173 master-0 kubenswrapper[28758]: I0223 14:38:35.466150 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-console-config\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " 
pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.468068 master-0 kubenswrapper[28758]: I0223 14:38:35.468007 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fdcd8d88c-hgj8l"] Feb 23 14:38:35.567573 master-0 kubenswrapper[28758]: I0223 14:38:35.567515 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-oauth-config\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.567573 master-0 kubenswrapper[28758]: I0223 14:38:35.567577 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-serving-cert\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.567842 master-0 kubenswrapper[28758]: I0223 14:38:35.567802 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-oauth-serving-cert\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.567896 master-0 kubenswrapper[28758]: I0223 14:38:35.567880 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-console-config\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.568016 master-0 kubenswrapper[28758]: I0223 14:38:35.567993 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-trusted-ca-bundle\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.568078 master-0 kubenswrapper[28758]: I0223 14:38:35.568034 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-service-ca\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.568078 master-0 kubenswrapper[28758]: I0223 14:38:35.568068 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhn2g\" (UniqueName: \"kubernetes.io/projected/871f487c-6da7-49af-9dc8-eff9b29305c4-kube-api-access-xhn2g\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.569536 master-0 kubenswrapper[28758]: I0223 14:38:35.569160 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-console-config\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.572362 master-0 kubenswrapper[28758]: I0223 14:38:35.572309 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-serving-cert\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.574168 master-0 
kubenswrapper[28758]: I0223 14:38:35.574120 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-oauth-config\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.678757 master-0 kubenswrapper[28758]: I0223 14:38:35.673949 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-oauth-serving-cert\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.678757 master-0 kubenswrapper[28758]: I0223 14:38:35.674199 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-trusted-ca-bundle\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.678757 master-0 kubenswrapper[28758]: I0223 14:38:35.674248 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-service-ca\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.678757 master-0 kubenswrapper[28758]: I0223 14:38:35.674368 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhn2g\" (UniqueName: \"kubernetes.io/projected/871f487c-6da7-49af-9dc8-eff9b29305c4-kube-api-access-xhn2g\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 
23 14:38:35.678757 master-0 kubenswrapper[28758]: I0223 14:38:35.675463 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-oauth-serving-cert\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.678757 master-0 kubenswrapper[28758]: I0223 14:38:35.675499 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-service-ca\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.678757 master-0 kubenswrapper[28758]: I0223 14:38:35.676670 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-trusted-ca-bundle\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.700340 master-0 kubenswrapper[28758]: I0223 14:38:35.700240 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhn2g\" (UniqueName: \"kubernetes.io/projected/871f487c-6da7-49af-9dc8-eff9b29305c4-kube-api-access-xhn2g\") pod \"console-fdcd8d88c-hgj8l\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:35.755217 master-0 kubenswrapper[28758]: I0223 14:38:35.755119 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:36.247753 master-0 kubenswrapper[28758]: I0223 14:38:36.247698 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fdcd8d88c-hgj8l"] Feb 23 14:38:36.250563 master-0 kubenswrapper[28758]: W0223 14:38:36.250444 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod871f487c_6da7_49af_9dc8_eff9b29305c4.slice/crio-93d907340893351058df10ed91bd2cc6e34de614b9a438a0c034e13d91e33f90 WatchSource:0}: Error finding container 93d907340893351058df10ed91bd2cc6e34de614b9a438a0c034e13d91e33f90: Status 404 returned error can't find the container with id 93d907340893351058df10ed91bd2cc6e34de614b9a438a0c034e13d91e33f90 Feb 23 14:38:36.709385 master-0 kubenswrapper[28758]: I0223 14:38:36.709287 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fdcd8d88c-hgj8l" event={"ID":"871f487c-6da7-49af-9dc8-eff9b29305c4","Type":"ContainerStarted","Data":"06c83b245d3e13dc0f32cb102c743fa36763ddde8e97ce28406f6f981b3ea50a"} Feb 23 14:38:36.709385 master-0 kubenswrapper[28758]: I0223 14:38:36.709347 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fdcd8d88c-hgj8l" event={"ID":"871f487c-6da7-49af-9dc8-eff9b29305c4","Type":"ContainerStarted","Data":"93d907340893351058df10ed91bd2cc6e34de614b9a438a0c034e13d91e33f90"} Feb 23 14:38:36.732136 master-0 kubenswrapper[28758]: I0223 14:38:36.732048 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fdcd8d88c-hgj8l" podStartSLOduration=1.732028254 podStartE2EDuration="1.732028254s" podCreationTimestamp="2026-02-23 14:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:38:36.729085055 +0000 UTC m=+248.855400997" 
watchObservedRunningTime="2026-02-23 14:38:36.732028254 +0000 UTC m=+248.858344186" Feb 23 14:38:37.780395 master-0 kubenswrapper[28758]: I0223 14:38:37.780349 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:38:37.785978 master-0 kubenswrapper[28758]: I0223 14:38:37.785937 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56556ccb8b-kfqz7" Feb 23 14:38:45.755397 master-0 kubenswrapper[28758]: I0223 14:38:45.755264 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:45.755397 master-0 kubenswrapper[28758]: I0223 14:38:45.755358 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:45.760686 master-0 kubenswrapper[28758]: I0223 14:38:45.760620 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:45.791689 master-0 kubenswrapper[28758]: I0223 14:38:45.791599 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:38:45.870812 master-0 kubenswrapper[28758]: I0223 14:38:45.870698 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c65cbb888-4xqr4"] Feb 23 14:39:10.915006 master-0 kubenswrapper[28758]: I0223 14:39:10.914909 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7c65cbb888-4xqr4" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" containerID="cri-o://b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc" gracePeriod=15 Feb 23 14:39:11.438650 master-0 kubenswrapper[28758]: I0223 14:39:11.438564 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7c65cbb888-4xqr4_4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae/console/0.log" Feb 23 14:39:11.438650 master-0 kubenswrapper[28758]: I0223 14:39:11.438628 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:39:11.451895 master-0 kubenswrapper[28758]: I0223 14:39:11.451844 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zpvn\" (UniqueName: \"kubernetes.io/projected/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-kube-api-access-5zpvn\") pod \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " Feb 23 14:39:11.452221 master-0 kubenswrapper[28758]: I0223 14:39:11.452177 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-config\") pod \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " Feb 23 14:39:11.452315 master-0 kubenswrapper[28758]: I0223 14:39:11.452277 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-service-ca\") pod \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " Feb 23 14:39:11.452400 master-0 kubenswrapper[28758]: I0223 14:39:11.452345 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-serving-cert\") pod \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " Feb 23 14:39:11.452400 master-0 kubenswrapper[28758]: I0223 14:39:11.452380 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-oauth-config\") pod \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " Feb 23 14:39:11.452562 master-0 kubenswrapper[28758]: I0223 14:39:11.452407 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-trusted-ca-bundle\") pod \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " Feb 23 14:39:11.452562 master-0 kubenswrapper[28758]: I0223 14:39:11.452520 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-oauth-serving-cert\") pod \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\" (UID: \"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae\") " Feb 23 14:39:11.453718 master-0 kubenswrapper[28758]: I0223 14:39:11.453621 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" (UID: "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:39:11.453718 master-0 kubenswrapper[28758]: I0223 14:39:11.453694 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" (UID: "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:39:11.453888 master-0 kubenswrapper[28758]: I0223 14:39:11.453841 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" (UID: "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:39:11.454239 master-0 kubenswrapper[28758]: I0223 14:39:11.454210 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-config" (OuterVolumeSpecName: "console-config") pod "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" (UID: "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:39:11.455245 master-0 kubenswrapper[28758]: I0223 14:39:11.455197 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-kube-api-access-5zpvn" (OuterVolumeSpecName: "kube-api-access-5zpvn") pod "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" (UID: "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae"). InnerVolumeSpecName "kube-api-access-5zpvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:39:11.457387 master-0 kubenswrapper[28758]: I0223 14:39:11.456720 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" (UID: "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:39:11.457387 master-0 kubenswrapper[28758]: I0223 14:39:11.457264 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" (UID: "4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:39:11.554832 master-0 kubenswrapper[28758]: I0223 14:39:11.554755 28758 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:39:11.554832 master-0 kubenswrapper[28758]: I0223 14:39:11.554802 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zpvn\" (UniqueName: \"kubernetes.io/projected/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-kube-api-access-5zpvn\") on node \"master-0\" DevicePath \"\"" Feb 23 14:39:11.554832 master-0 kubenswrapper[28758]: I0223 14:39:11.554812 28758 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:39:11.554832 master-0 kubenswrapper[28758]: I0223 14:39:11.554821 28758 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-service-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:39:11.554832 master-0 kubenswrapper[28758]: I0223 14:39:11.554835 28758 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 
14:39:11.554832 master-0 kubenswrapper[28758]: I0223 14:39:11.554845 28758 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:39:11.555237 master-0 kubenswrapper[28758]: I0223 14:39:11.554854 28758 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:39:12.011613 master-0 kubenswrapper[28758]: I0223 14:39:12.011558 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c65cbb888-4xqr4_4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae/console/0.log" Feb 23 14:39:12.012173 master-0 kubenswrapper[28758]: I0223 14:39:12.011643 28758 generic.go:334] "Generic (PLEG): container finished" podID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerID="b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc" exitCode=2 Feb 23 14:39:12.012173 master-0 kubenswrapper[28758]: I0223 14:39:12.011690 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c65cbb888-4xqr4" event={"ID":"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae","Type":"ContainerDied","Data":"b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc"} Feb 23 14:39:12.012173 master-0 kubenswrapper[28758]: I0223 14:39:12.011711 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c65cbb888-4xqr4" Feb 23 14:39:12.012173 master-0 kubenswrapper[28758]: I0223 14:39:12.011747 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c65cbb888-4xqr4" event={"ID":"4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae","Type":"ContainerDied","Data":"7b441f14a306510646ba50e569176b796adaf9ce100e9c4629c4acf49c8a1a24"} Feb 23 14:39:12.012173 master-0 kubenswrapper[28758]: I0223 14:39:12.011777 28758 scope.go:117] "RemoveContainer" containerID="b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc" Feb 23 14:39:12.039118 master-0 kubenswrapper[28758]: I0223 14:39:12.039082 28758 scope.go:117] "RemoveContainer" containerID="b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc" Feb 23 14:39:12.040304 master-0 kubenswrapper[28758]: E0223 14:39:12.040252 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc\": container with ID starting with b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc not found: ID does not exist" containerID="b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc" Feb 23 14:39:12.040410 master-0 kubenswrapper[28758]: I0223 14:39:12.040310 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc"} err="failed to get container status \"b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc\": rpc error: code = NotFound desc = could not find container \"b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc\": container with ID starting with b04cf2581ce5855388add9518ef51a60c0cff59ad6ca1763c50b74796b962ecc not found: ID does not exist" Feb 23 14:39:12.059319 master-0 kubenswrapper[28758]: I0223 14:39:12.059213 28758 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-7c65cbb888-4xqr4"] Feb 23 14:39:12.067671 master-0 kubenswrapper[28758]: I0223 14:39:12.067586 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c65cbb888-4xqr4"] Feb 23 14:39:12.102464 master-0 kubenswrapper[28758]: I0223 14:39:12.102386 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" path="/var/lib/kubelet/pods/4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae/volumes" Feb 23 14:39:25.061672 master-0 kubenswrapper[28758]: I0223 14:39:25.061618 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:39:25.099003 master-0 kubenswrapper[28758]: I0223 14:39:25.098964 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:39:25.158475 master-0 kubenswrapper[28758]: I0223 14:39:25.158401 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Feb 23 14:39:28.071124 master-0 kubenswrapper[28758]: I0223 14:39:28.071055 28758 kubelet.go:1505] "Image garbage collection succeeded" Feb 23 14:39:28.328164 master-0 kubenswrapper[28758]: I0223 14:39:28.327979 28758 scope.go:117] "RemoveContainer" containerID="610ae43e00e8e1b0ff3dba88a6993fdf43f969aae5bdeeca94356519cf7c2602" Feb 23 14:39:28.347929 master-0 kubenswrapper[28758]: I0223 14:39:28.347813 28758 scope.go:117] "RemoveContainer" containerID="7a566b5e0634944e8d5422e837762f49e59d372d80be5def3116f1b2efb53f3a" Feb 23 14:39:28.367398 master-0 kubenswrapper[28758]: I0223 14:39:28.367353 28758 scope.go:117] "RemoveContainer" containerID="a83359f382bbf7e84f344fddfee0b01fcac40fd46b179877c37b9571576884e8" Feb 23 14:39:28.389594 master-0 kubenswrapper[28758]: I0223 14:39:28.389555 28758 scope.go:117] "RemoveContainer" 
containerID="1bc827a8854dfec15010881aca2028b9a63aeaba5a66ba610581f32b2d5f3a53" Feb 23 14:39:28.417156 master-0 kubenswrapper[28758]: I0223 14:39:28.416673 28758 scope.go:117] "RemoveContainer" containerID="a607c62f2f6fcfd8c1e82eea4a4f2c6c0363686e9d645511b15629d774c518ef" Feb 23 14:40:13.503572 master-0 kubenswrapper[28758]: I0223 14:40:13.503522 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-688ffc4cbf-2brtn"] Feb 23 14:40:13.504437 master-0 kubenswrapper[28758]: E0223 14:40:13.504417 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" Feb 23 14:40:13.504566 master-0 kubenswrapper[28758]: I0223 14:40:13.504556 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" Feb 23 14:40:13.504795 master-0 kubenswrapper[28758]: I0223 14:40:13.504782 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9d0b6c-17e9-4a2c-b825-004ae4bc5bae" containerName="console" Feb 23 14:40:13.505577 master-0 kubenswrapper[28758]: I0223 14:40:13.505561 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.530876 master-0 kubenswrapper[28758]: I0223 14:40:13.530830 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-688ffc4cbf-2brtn"] Feb 23 14:40:13.602386 master-0 kubenswrapper[28758]: I0223 14:40:13.602302 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-oauth-serving-cert\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.602644 master-0 kubenswrapper[28758]: I0223 14:40:13.602406 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-oauth-config\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.602644 master-0 kubenswrapper[28758]: I0223 14:40:13.602437 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-service-ca\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.602644 master-0 kubenswrapper[28758]: I0223 14:40:13.602464 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqmt4\" (UniqueName: \"kubernetes.io/projected/a34af5db-bbf9-4a9d-a822-112be40e78fe-kube-api-access-rqmt4\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.602644 master-0 
kubenswrapper[28758]: I0223 14:40:13.602547 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-serving-cert\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.602644 master-0 kubenswrapper[28758]: I0223 14:40:13.602582 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-trusted-ca-bundle\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.602644 master-0 kubenswrapper[28758]: I0223 14:40:13.602604 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-config\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.703405 master-0 kubenswrapper[28758]: I0223 14:40:13.703370 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-serving-cert\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.703734 master-0 kubenswrapper[28758]: I0223 14:40:13.703706 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-trusted-ca-bundle\") pod \"console-688ffc4cbf-2brtn\" (UID: 
\"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.703877 master-0 kubenswrapper[28758]: I0223 14:40:13.703860 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-config\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.703983 master-0 kubenswrapper[28758]: I0223 14:40:13.703970 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-oauth-serving-cert\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.704096 master-0 kubenswrapper[28758]: I0223 14:40:13.704083 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-oauth-config\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.704200 master-0 kubenswrapper[28758]: I0223 14:40:13.704181 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-service-ca\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.704303 master-0 kubenswrapper[28758]: I0223 14:40:13.704289 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqmt4\" (UniqueName: \"kubernetes.io/projected/a34af5db-bbf9-4a9d-a822-112be40e78fe-kube-api-access-rqmt4\") 
pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.704686 master-0 kubenswrapper[28758]: I0223 14:40:13.704647 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-trusted-ca-bundle\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.704786 master-0 kubenswrapper[28758]: I0223 14:40:13.704748 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-config\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.704990 master-0 kubenswrapper[28758]: I0223 14:40:13.704878 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-oauth-serving-cert\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.704990 master-0 kubenswrapper[28758]: I0223 14:40:13.704893 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-service-ca\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.706445 master-0 kubenswrapper[28758]: I0223 14:40:13.706406 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-serving-cert\") pod 
\"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.707157 master-0 kubenswrapper[28758]: I0223 14:40:13.707111 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-oauth-config\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.721355 master-0 kubenswrapper[28758]: I0223 14:40:13.721297 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqmt4\" (UniqueName: \"kubernetes.io/projected/a34af5db-bbf9-4a9d-a822-112be40e78fe-kube-api-access-rqmt4\") pod \"console-688ffc4cbf-2brtn\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:13.855404 master-0 kubenswrapper[28758]: I0223 14:40:13.855251 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:14.281911 master-0 kubenswrapper[28758]: I0223 14:40:14.281852 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-688ffc4cbf-2brtn"] Feb 23 14:40:14.286336 master-0 kubenswrapper[28758]: W0223 14:40:14.286284 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda34af5db_bbf9_4a9d_a822_112be40e78fe.slice/crio-498b3ed84d24314845b96669bd21a641bff94f329e776e9433c847fcf9d3c13d WatchSource:0}: Error finding container 498b3ed84d24314845b96669bd21a641bff94f329e776e9433c847fcf9d3c13d: Status 404 returned error can't find the container with id 498b3ed84d24314845b96669bd21a641bff94f329e776e9433c847fcf9d3c13d Feb 23 14:40:14.725147 master-0 kubenswrapper[28758]: I0223 14:40:14.725074 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688ffc4cbf-2brtn" event={"ID":"a34af5db-bbf9-4a9d-a822-112be40e78fe","Type":"ContainerStarted","Data":"416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa"} Feb 23 14:40:14.725147 master-0 kubenswrapper[28758]: I0223 14:40:14.725139 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688ffc4cbf-2brtn" event={"ID":"a34af5db-bbf9-4a9d-a822-112be40e78fe","Type":"ContainerStarted","Data":"498b3ed84d24314845b96669bd21a641bff94f329e776e9433c847fcf9d3c13d"} Feb 23 14:40:14.753064 master-0 kubenswrapper[28758]: I0223 14:40:14.752969 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-688ffc4cbf-2brtn" podStartSLOduration=1.752944587 podStartE2EDuration="1.752944587s" podCreationTimestamp="2026-02-23 14:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:40:14.749773712 +0000 UTC m=+346.876089654" 
watchObservedRunningTime="2026-02-23 14:40:14.752944587 +0000 UTC m=+346.879260509" Feb 23 14:40:23.855961 master-0 kubenswrapper[28758]: I0223 14:40:23.855867 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:23.856725 master-0 kubenswrapper[28758]: I0223 14:40:23.855988 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:23.862049 master-0 kubenswrapper[28758]: I0223 14:40:23.861998 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:24.807277 master-0 kubenswrapper[28758]: I0223 14:40:24.807215 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:40:24.881668 master-0 kubenswrapper[28758]: I0223 14:40:24.881584 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fdcd8d88c-hgj8l"] Feb 23 14:40:27.828532 master-0 kubenswrapper[28758]: I0223 14:40:27.828469 28758 generic.go:334] "Generic (PLEG): container finished" podID="9416f5d0-32b4-4065-b678-26913af8b6dd" containerID="f866731e4ac5121ccde39a6f28422037df55500fc5889296919662d103c3a36f" exitCode=0 Feb 23 14:40:27.829103 master-0 kubenswrapper[28758]: I0223 14:40:27.828542 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" event={"ID":"9416f5d0-32b4-4065-b678-26913af8b6dd","Type":"ContainerDied","Data":"f866731e4ac5121ccde39a6f28422037df55500fc5889296919662d103c3a36f"} Feb 23 14:40:27.829103 master-0 kubenswrapper[28758]: I0223 14:40:27.828595 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" 
event={"ID":"9416f5d0-32b4-4065-b678-26913af8b6dd","Type":"ContainerDied","Data":"ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436"} Feb 23 14:40:27.829103 master-0 kubenswrapper[28758]: I0223 14:40:27.828612 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba80f8cbf4454b204ee21a5520078d48d5261a99279a142cb4f152e1edc60436" Feb 23 14:40:27.886275 master-0 kubenswrapper[28758]: I0223 14:40:27.886226 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:40:28.013086 master-0 kubenswrapper[28758]: I0223 14:40:28.013011 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") pod \"9416f5d0-32b4-4065-b678-26913af8b6dd\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " Feb 23 14:40:28.013326 master-0 kubenswrapper[28758]: I0223 14:40:28.013118 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") pod \"9416f5d0-32b4-4065-b678-26913af8b6dd\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " Feb 23 14:40:28.013326 master-0 kubenswrapper[28758]: I0223 14:40:28.013171 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hnfl\" (UniqueName: \"kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl\") pod \"9416f5d0-32b4-4065-b678-26913af8b6dd\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " Feb 23 14:40:28.013326 master-0 kubenswrapper[28758]: I0223 14:40:28.013209 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") pod \"9416f5d0-32b4-4065-b678-26913af8b6dd\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " Feb 23 14:40:28.013326 master-0 kubenswrapper[28758]: I0223 14:40:28.013250 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log\") pod \"9416f5d0-32b4-4065-b678-26913af8b6dd\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " Feb 23 14:40:28.013326 master-0 kubenswrapper[28758]: I0223 14:40:28.013290 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") pod \"9416f5d0-32b4-4065-b678-26913af8b6dd\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " Feb 23 14:40:28.013629 master-0 kubenswrapper[28758]: I0223 14:40:28.013331 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") pod \"9416f5d0-32b4-4065-b678-26913af8b6dd\" (UID: \"9416f5d0-32b4-4065-b678-26913af8b6dd\") " Feb 23 14:40:28.014158 master-0 kubenswrapper[28758]: I0223 14:40:28.013958 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log" (OuterVolumeSpecName: "audit-log") pod "9416f5d0-32b4-4065-b678-26913af8b6dd" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd"). InnerVolumeSpecName "audit-log". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:40:28.014158 master-0 kubenswrapper[28758]: I0223 14:40:28.014026 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "9416f5d0-32b4-4065-b678-26913af8b6dd" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:40:28.014158 master-0 kubenswrapper[28758]: I0223 14:40:28.014123 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "9416f5d0-32b4-4065-b678-26913af8b6dd" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:40:28.016551 master-0 kubenswrapper[28758]: I0223 14:40:28.016469 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl" (OuterVolumeSpecName: "kube-api-access-7hnfl") pod "9416f5d0-32b4-4065-b678-26913af8b6dd" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd"). InnerVolumeSpecName "kube-api-access-7hnfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:40:28.016973 master-0 kubenswrapper[28758]: I0223 14:40:28.016935 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "9416f5d0-32b4-4065-b678-26913af8b6dd" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:40:28.018683 master-0 kubenswrapper[28758]: I0223 14:40:28.018646 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "9416f5d0-32b4-4065-b678-26913af8b6dd" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:40:28.018746 master-0 kubenswrapper[28758]: I0223 14:40:28.018715 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "9416f5d0-32b4-4065-b678-26913af8b6dd" (UID: "9416f5d0-32b4-4065-b678-26913af8b6dd"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:40:28.115430 master-0 kubenswrapper[28758]: I0223 14:40:28.115267 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hnfl\" (UniqueName: \"kubernetes.io/projected/9416f5d0-32b4-4065-b678-26913af8b6dd-kube-api-access-7hnfl\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:28.115430 master-0 kubenswrapper[28758]: I0223 14:40:28.115311 28758 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:28.115430 master-0 kubenswrapper[28758]: I0223 14:40:28.115355 28758 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9416f5d0-32b4-4065-b678-26913af8b6dd-audit-log\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:28.115430 master-0 kubenswrapper[28758]: I0223 14:40:28.115369 28758 
reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:28.115430 master-0 kubenswrapper[28758]: I0223 14:40:28.115378 28758 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9416f5d0-32b4-4065-b678-26913af8b6dd-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:28.115430 master-0 kubenswrapper[28758]: I0223 14:40:28.115387 28758 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:28.115430 master-0 kubenswrapper[28758]: I0223 14:40:28.115396 28758 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9416f5d0-32b4-4065-b678-26913af8b6dd-client-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:28.488325 master-0 kubenswrapper[28758]: I0223 14:40:28.488248 28758 scope.go:117] "RemoveContainer" containerID="f866731e4ac5121ccde39a6f28422037df55500fc5889296919662d103c3a36f" Feb 23 14:40:28.617040 master-0 kubenswrapper[28758]: I0223 14:40:28.616972 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xjjxf"] Feb 23 14:40:28.617281 master-0 kubenswrapper[28758]: E0223 14:40:28.617255 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9416f5d0-32b4-4065-b678-26913af8b6dd" containerName="metrics-server" Feb 23 14:40:28.617281 master-0 kubenswrapper[28758]: I0223 14:40:28.617268 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="9416f5d0-32b4-4065-b678-26913af8b6dd" containerName="metrics-server" Feb 23 14:40:28.617449 master-0 kubenswrapper[28758]: I0223 
14:40:28.617429 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="9416f5d0-32b4-4065-b678-26913af8b6dd" containerName="metrics-server" Feb 23 14:40:28.618103 master-0 kubenswrapper[28758]: I0223 14:40:28.618072 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.621025 master-0 kubenswrapper[28758]: I0223 14:40:28.620588 28758 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Feb 23 14:40:28.621025 master-0 kubenswrapper[28758]: I0223 14:40:28.620823 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Feb 23 14:40:28.621025 master-0 kubenswrapper[28758]: I0223 14:40:28.620868 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Feb 23 14:40:28.623640 master-0 kubenswrapper[28758]: I0223 14:40:28.623593 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Feb 23 14:40:28.628643 master-0 kubenswrapper[28758]: I0223 14:40:28.628586 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xjjxf"] Feb 23 14:40:28.724281 master-0 kubenswrapper[28758]: I0223 14:40:28.724198 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/dde92d85-6a81-45fa-a900-097b3dc3af38-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.724281 master-0 kubenswrapper[28758]: I0223 14:40:28.724253 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: 
\"kubernetes.io/secret/dde92d85-6a81-45fa-a900-097b3dc3af38-os-client-config\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.724615 master-0 kubenswrapper[28758]: I0223 14:40:28.724300 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6d2\" (UniqueName: \"kubernetes.io/projected/dde92d85-6a81-45fa-a900-097b3dc3af38-kube-api-access-lk6d2\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.825969 master-0 kubenswrapper[28758]: I0223 14:40:28.825819 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/dde92d85-6a81-45fa-a900-097b3dc3af38-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.825969 master-0 kubenswrapper[28758]: I0223 14:40:28.825895 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/dde92d85-6a81-45fa-a900-097b3dc3af38-os-client-config\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.825969 master-0 kubenswrapper[28758]: I0223 14:40:28.825949 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6d2\" (UniqueName: \"kubernetes.io/projected/dde92d85-6a81-45fa-a900-097b3dc3af38-kube-api-access-lk6d2\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.826929 master-0 kubenswrapper[28758]: 
I0223 14:40:28.826882 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/dde92d85-6a81-45fa-a900-097b3dc3af38-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.829084 master-0 kubenswrapper[28758]: I0223 14:40:28.829018 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/dde92d85-6a81-45fa-a900-097b3dc3af38-os-client-config\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.833277 master-0 kubenswrapper[28758]: I0223 14:40:28.833234 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f55d8f669-b2gf9" Feb 23 14:40:28.842780 master-0 kubenswrapper[28758]: I0223 14:40:28.842736 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6d2\" (UniqueName: \"kubernetes.io/projected/dde92d85-6a81-45fa-a900-097b3dc3af38-kube-api-access-lk6d2\") pod \"sushy-emulator-78f6d7d749-xjjxf\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:28.894277 master-0 kubenswrapper[28758]: I0223 14:40:28.894198 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-f55d8f669-b2gf9"] Feb 23 14:40:28.899707 master-0 kubenswrapper[28758]: I0223 14:40:28.899644 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-f55d8f669-b2gf9"] Feb 23 14:40:28.964790 master-0 kubenswrapper[28758]: I0223 14:40:28.964711 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:29.388248 master-0 kubenswrapper[28758]: I0223 14:40:29.388192 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xjjxf"] Feb 23 14:40:29.389771 master-0 kubenswrapper[28758]: W0223 14:40:29.389718 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddde92d85_6a81_45fa_a900_097b3dc3af38.slice/crio-f56539d3604b2838d97f3306cb4d70a39fba6b8748960044e591b34848cd4570 WatchSource:0}: Error finding container f56539d3604b2838d97f3306cb4d70a39fba6b8748960044e591b34848cd4570: Status 404 returned error can't find the container with id f56539d3604b2838d97f3306cb4d70a39fba6b8748960044e591b34848cd4570 Feb 23 14:40:29.392083 master-0 kubenswrapper[28758]: I0223 14:40:29.392050 28758 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 14:40:29.842120 master-0 kubenswrapper[28758]: I0223 14:40:29.842009 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" event={"ID":"dde92d85-6a81-45fa-a900-097b3dc3af38","Type":"ContainerStarted","Data":"f56539d3604b2838d97f3306cb4d70a39fba6b8748960044e591b34848cd4570"} Feb 23 14:40:30.096635 master-0 kubenswrapper[28758]: I0223 14:40:30.096466 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9416f5d0-32b4-4065-b678-26913af8b6dd" path="/var/lib/kubelet/pods/9416f5d0-32b4-4065-b678-26913af8b6dd/volumes" Feb 23 14:40:35.204054 master-0 kubenswrapper[28758]: I0223 14:40:35.203963 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Feb 23 14:40:35.205184 master-0 kubenswrapper[28758]: I0223 14:40:35.205155 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.207895 master-0 kubenswrapper[28758]: I0223 14:40:35.207800 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 23 14:40:35.207895 master-0 kubenswrapper[28758]: I0223 14:40:35.207816 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-4hjzs" Feb 23 14:40:35.216553 master-0 kubenswrapper[28758]: I0223 14:40:35.216305 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Feb 23 14:40:35.324324 master-0 kubenswrapper[28758]: I0223 14:40:35.324260 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.324324 master-0 kubenswrapper[28758]: I0223 14:40:35.324312 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8416da49-c72c-4f48-a054-bad4fa20e56d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.324581 master-0 kubenswrapper[28758]: I0223 14:40:35.324383 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-var-lock\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.426555 master-0 
kubenswrapper[28758]: I0223 14:40:35.426452 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-var-lock\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.426833 master-0 kubenswrapper[28758]: I0223 14:40:35.426587 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.426833 master-0 kubenswrapper[28758]: I0223 14:40:35.426608 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8416da49-c72c-4f48-a054-bad4fa20e56d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.427061 master-0 kubenswrapper[28758]: I0223 14:40:35.426990 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-var-lock\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.427061 master-0 kubenswrapper[28758]: I0223 14:40:35.427004 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.462465 
master-0 kubenswrapper[28758]: I0223 14:40:35.462325 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8416da49-c72c-4f48-a054-bad4fa20e56d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") " pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.535947 master-0 kubenswrapper[28758]: I0223 14:40:35.535890 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Feb 23 14:40:35.880168 master-0 kubenswrapper[28758]: I0223 14:40:35.879905 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" event={"ID":"dde92d85-6a81-45fa-a900-097b3dc3af38","Type":"ContainerStarted","Data":"3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2"} Feb 23 14:40:35.900355 master-0 kubenswrapper[28758]: I0223 14:40:35.900270 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" podStartSLOduration=2.090828287 podStartE2EDuration="7.900249781s" podCreationTimestamp="2026-02-23 14:40:28 +0000 UTC" firstStartedPulling="2026-02-23 14:40:29.391930972 +0000 UTC m=+361.518246904" lastFinishedPulling="2026-02-23 14:40:35.201352466 +0000 UTC m=+367.327668398" observedRunningTime="2026-02-23 14:40:35.897153808 +0000 UTC m=+368.023469770" watchObservedRunningTime="2026-02-23 14:40:35.900249781 +0000 UTC m=+368.026565713" Feb 23 14:40:35.960009 master-0 kubenswrapper[28758]: I0223 14:40:35.959952 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Feb 23 14:40:35.962222 master-0 kubenswrapper[28758]: W0223 14:40:35.962175 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod8416da49_c72c_4f48_a054_bad4fa20e56d.slice/crio-49e147d878a7a864b2aad39486e66741da19414611fce969bbefa8ddf0ccf382 WatchSource:0}: Error finding container 49e147d878a7a864b2aad39486e66741da19414611fce969bbefa8ddf0ccf382: Status 404 returned error can't find the container with id 49e147d878a7a864b2aad39486e66741da19414611fce969bbefa8ddf0ccf382 Feb 23 14:40:36.904205 master-0 kubenswrapper[28758]: I0223 14:40:36.904015 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"8416da49-c72c-4f48-a054-bad4fa20e56d","Type":"ContainerStarted","Data":"e1b6fdb79b23dddfb94e4cc28afba8aafa9fdd08a8626a8102e0afa087c61c39"} Feb 23 14:40:36.904205 master-0 kubenswrapper[28758]: I0223 14:40:36.904102 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"8416da49-c72c-4f48-a054-bad4fa20e56d","Type":"ContainerStarted","Data":"49e147d878a7a864b2aad39486e66741da19414611fce969bbefa8ddf0ccf382"} Feb 23 14:40:36.930377 master-0 kubenswrapper[28758]: I0223 14:40:36.930272 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=1.930250552 podStartE2EDuration="1.930250552s" podCreationTimestamp="2026-02-23 14:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:40:36.924046726 +0000 UTC m=+369.050362668" watchObservedRunningTime="2026-02-23 14:40:36.930250552 +0000 UTC m=+369.056566474" Feb 23 14:40:38.965847 master-0 kubenswrapper[28758]: I0223 14:40:38.965762 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:38.966453 master-0 kubenswrapper[28758]: I0223 14:40:38.966438 28758 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:38.976033 master-0 kubenswrapper[28758]: I0223 14:40:38.975963 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:39.928980 master-0 kubenswrapper[28758]: I0223 14:40:39.928909 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:40:49.918840 master-0 kubenswrapper[28758]: I0223 14:40:49.918703 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-fdcd8d88c-hgj8l" podUID="871f487c-6da7-49af-9dc8-eff9b29305c4" containerName="console" containerID="cri-o://06c83b245d3e13dc0f32cb102c743fa36763ddde8e97ce28406f6f981b3ea50a" gracePeriod=15 Feb 23 14:40:51.007139 master-0 kubenswrapper[28758]: I0223 14:40:51.007081 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fdcd8d88c-hgj8l_871f487c-6da7-49af-9dc8-eff9b29305c4/console/0.log" Feb 23 14:40:51.007139 master-0 kubenswrapper[28758]: I0223 14:40:51.007134 28758 generic.go:334] "Generic (PLEG): container finished" podID="871f487c-6da7-49af-9dc8-eff9b29305c4" containerID="06c83b245d3e13dc0f32cb102c743fa36763ddde8e97ce28406f6f981b3ea50a" exitCode=2 Feb 23 14:40:51.007932 master-0 kubenswrapper[28758]: I0223 14:40:51.007158 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fdcd8d88c-hgj8l" event={"ID":"871f487c-6da7-49af-9dc8-eff9b29305c4","Type":"ContainerDied","Data":"06c83b245d3e13dc0f32cb102c743fa36763ddde8e97ce28406f6f981b3ea50a"} Feb 23 14:40:51.098835 master-0 kubenswrapper[28758]: I0223 14:40:51.098768 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fdcd8d88c-hgj8l_871f487c-6da7-49af-9dc8-eff9b29305c4/console/0.log" Feb 23 14:40:51.099090 master-0 kubenswrapper[28758]: I0223 
14:40:51.098864 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:40:51.283064 master-0 kubenswrapper[28758]: I0223 14:40:51.282993 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-oauth-config\") pod \"871f487c-6da7-49af-9dc8-eff9b29305c4\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " Feb 23 14:40:51.283322 master-0 kubenswrapper[28758]: I0223 14:40:51.283098 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-trusted-ca-bundle\") pod \"871f487c-6da7-49af-9dc8-eff9b29305c4\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " Feb 23 14:40:51.283322 master-0 kubenswrapper[28758]: I0223 14:40:51.283143 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-oauth-serving-cert\") pod \"871f487c-6da7-49af-9dc8-eff9b29305c4\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " Feb 23 14:40:51.283322 master-0 kubenswrapper[28758]: I0223 14:40:51.283206 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-serving-cert\") pod \"871f487c-6da7-49af-9dc8-eff9b29305c4\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " Feb 23 14:40:51.283322 master-0 kubenswrapper[28758]: I0223 14:40:51.283293 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhn2g\" (UniqueName: \"kubernetes.io/projected/871f487c-6da7-49af-9dc8-eff9b29305c4-kube-api-access-xhn2g\") pod \"871f487c-6da7-49af-9dc8-eff9b29305c4\" (UID: 
\"871f487c-6da7-49af-9dc8-eff9b29305c4\") " Feb 23 14:40:51.283528 master-0 kubenswrapper[28758]: I0223 14:40:51.283423 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-service-ca\") pod \"871f487c-6da7-49af-9dc8-eff9b29305c4\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " Feb 23 14:40:51.284121 master-0 kubenswrapper[28758]: I0223 14:40:51.283555 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-console-config\") pod \"871f487c-6da7-49af-9dc8-eff9b29305c4\" (UID: \"871f487c-6da7-49af-9dc8-eff9b29305c4\") " Feb 23 14:40:51.284121 master-0 kubenswrapper[28758]: I0223 14:40:51.283740 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "871f487c-6da7-49af-9dc8-eff9b29305c4" (UID: "871f487c-6da7-49af-9dc8-eff9b29305c4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:40:51.284443 master-0 kubenswrapper[28758]: I0223 14:40:51.284394 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "871f487c-6da7-49af-9dc8-eff9b29305c4" (UID: "871f487c-6da7-49af-9dc8-eff9b29305c4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:40:51.284531 master-0 kubenswrapper[28758]: I0223 14:40:51.284444 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-console-config" (OuterVolumeSpecName: "console-config") pod "871f487c-6da7-49af-9dc8-eff9b29305c4" (UID: "871f487c-6da7-49af-9dc8-eff9b29305c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:40:51.284700 master-0 kubenswrapper[28758]: I0223 14:40:51.284647 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "871f487c-6da7-49af-9dc8-eff9b29305c4" (UID: "871f487c-6da7-49af-9dc8-eff9b29305c4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:40:51.285168 master-0 kubenswrapper[28758]: I0223 14:40:51.285130 28758 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-service-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:51.285230 master-0 kubenswrapper[28758]: I0223 14:40:51.285171 28758 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-console-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:51.285230 master-0 kubenswrapper[28758]: I0223 14:40:51.285193 28758 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:51.285230 master-0 kubenswrapper[28758]: I0223 14:40:51.285217 28758 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/871f487c-6da7-49af-9dc8-eff9b29305c4-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:51.289291 master-0 kubenswrapper[28758]: I0223 14:40:51.289227 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "871f487c-6da7-49af-9dc8-eff9b29305c4" (UID: "871f487c-6da7-49af-9dc8-eff9b29305c4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:40:51.289410 master-0 kubenswrapper[28758]: I0223 14:40:51.289357 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871f487c-6da7-49af-9dc8-eff9b29305c4-kube-api-access-xhn2g" (OuterVolumeSpecName: "kube-api-access-xhn2g") pod "871f487c-6da7-49af-9dc8-eff9b29305c4" (UID: "871f487c-6da7-49af-9dc8-eff9b29305c4"). InnerVolumeSpecName "kube-api-access-xhn2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:40:51.289634 master-0 kubenswrapper[28758]: I0223 14:40:51.289612 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "871f487c-6da7-49af-9dc8-eff9b29305c4" (UID: "871f487c-6da7-49af-9dc8-eff9b29305c4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:40:51.386399 master-0 kubenswrapper[28758]: I0223 14:40:51.386245 28758 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:51.386399 master-0 kubenswrapper[28758]: I0223 14:40:51.386335 28758 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/871f487c-6da7-49af-9dc8-eff9b29305c4-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:51.386399 master-0 kubenswrapper[28758]: I0223 14:40:51.386362 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhn2g\" (UniqueName: \"kubernetes.io/projected/871f487c-6da7-49af-9dc8-eff9b29305c4-kube-api-access-xhn2g\") on node \"master-0\" DevicePath \"\"" Feb 23 14:40:52.015151 master-0 kubenswrapper[28758]: I0223 14:40:52.015080 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fdcd8d88c-hgj8l_871f487c-6da7-49af-9dc8-eff9b29305c4/console/0.log" Feb 23 14:40:52.015151 master-0 kubenswrapper[28758]: I0223 14:40:52.015150 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fdcd8d88c-hgj8l" event={"ID":"871f487c-6da7-49af-9dc8-eff9b29305c4","Type":"ContainerDied","Data":"93d907340893351058df10ed91bd2cc6e34de614b9a438a0c034e13d91e33f90"} Feb 23 14:40:52.015886 master-0 kubenswrapper[28758]: I0223 14:40:52.015186 28758 scope.go:117] "RemoveContainer" containerID="06c83b245d3e13dc0f32cb102c743fa36763ddde8e97ce28406f6f981b3ea50a" Feb 23 14:40:52.015886 master-0 kubenswrapper[28758]: I0223 14:40:52.015214 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fdcd8d88c-hgj8l" Feb 23 14:40:52.098748 master-0 kubenswrapper[28758]: I0223 14:40:52.098697 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fdcd8d88c-hgj8l"] Feb 23 14:40:52.099345 master-0 kubenswrapper[28758]: I0223 14:40:52.099301 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-fdcd8d88c-hgj8l"] Feb 23 14:40:54.095580 master-0 kubenswrapper[28758]: I0223 14:40:54.095516 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871f487c-6da7-49af-9dc8-eff9b29305c4" path="/var/lib/kubelet/pods/871f487c-6da7-49af-9dc8-eff9b29305c4/volumes" Feb 23 14:40:57.791832 master-0 kubenswrapper[28758]: I0223 14:40:57.791741 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-85fdff4757-8qt2z"] Feb 23 14:40:57.792495 master-0 kubenswrapper[28758]: E0223 14:40:57.792124 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871f487c-6da7-49af-9dc8-eff9b29305c4" containerName="console" Feb 23 14:40:57.792495 master-0 kubenswrapper[28758]: I0223 14:40:57.792145 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="871f487c-6da7-49af-9dc8-eff9b29305c4" containerName="console" Feb 23 14:40:57.792495 master-0 kubenswrapper[28758]: I0223 14:40:57.792355 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="871f487c-6da7-49af-9dc8-eff9b29305c4" containerName="console" Feb 23 14:40:57.793285 master-0 kubenswrapper[28758]: I0223 14:40:57.793249 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:57.882303 master-0 kubenswrapper[28758]: I0223 14:40:57.882233 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-85fdff4757-8qt2z"] Feb 23 14:40:57.981569 master-0 kubenswrapper[28758]: I0223 14:40:57.981442 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/28f4aac4-1063-4387-8d70-188a3d572604-os-client-config\") pod \"nova-console-poller-85fdff4757-8qt2z\" (UID: \"28f4aac4-1063-4387-8d70-188a3d572604\") " pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:57.981867 master-0 kubenswrapper[28758]: I0223 14:40:57.981623 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4br\" (UniqueName: \"kubernetes.io/projected/28f4aac4-1063-4387-8d70-188a3d572604-kube-api-access-xf4br\") pod \"nova-console-poller-85fdff4757-8qt2z\" (UID: \"28f4aac4-1063-4387-8d70-188a3d572604\") " pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:58.083058 master-0 kubenswrapper[28758]: I0223 14:40:58.082906 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/28f4aac4-1063-4387-8d70-188a3d572604-os-client-config\") pod \"nova-console-poller-85fdff4757-8qt2z\" (UID: \"28f4aac4-1063-4387-8d70-188a3d572604\") " pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:58.083058 master-0 kubenswrapper[28758]: I0223 14:40:58.083055 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4br\" (UniqueName: \"kubernetes.io/projected/28f4aac4-1063-4387-8d70-188a3d572604-kube-api-access-xf4br\") pod \"nova-console-poller-85fdff4757-8qt2z\" (UID: \"28f4aac4-1063-4387-8d70-188a3d572604\") " 
pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:58.087624 master-0 kubenswrapper[28758]: I0223 14:40:58.087564 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/28f4aac4-1063-4387-8d70-188a3d572604-os-client-config\") pod \"nova-console-poller-85fdff4757-8qt2z\" (UID: \"28f4aac4-1063-4387-8d70-188a3d572604\") " pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:58.107150 master-0 kubenswrapper[28758]: I0223 14:40:58.107109 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4br\" (UniqueName: \"kubernetes.io/projected/28f4aac4-1063-4387-8d70-188a3d572604-kube-api-access-xf4br\") pod \"nova-console-poller-85fdff4757-8qt2z\" (UID: \"28f4aac4-1063-4387-8d70-188a3d572604\") " pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:58.114705 master-0 kubenswrapper[28758]: I0223 14:40:58.114653 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" Feb 23 14:40:58.587283 master-0 kubenswrapper[28758]: I0223 14:40:58.587218 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-85fdff4757-8qt2z"] Feb 23 14:40:58.587726 master-0 kubenswrapper[28758]: W0223 14:40:58.587681 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f4aac4_1063_4387_8d70_188a3d572604.slice/crio-98744e148771cb14225d3666aff33c4dcbb3726c393f3709ad0a2ca3cb81c50e WatchSource:0}: Error finding container 98744e148771cb14225d3666aff33c4dcbb3726c393f3709ad0a2ca3cb81c50e: Status 404 returned error can't find the container with id 98744e148771cb14225d3666aff33c4dcbb3726c393f3709ad0a2ca3cb81c50e Feb 23 14:40:59.071121 master-0 kubenswrapper[28758]: I0223 14:40:59.071032 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" event={"ID":"28f4aac4-1063-4387-8d70-188a3d572604","Type":"ContainerStarted","Data":"98744e148771cb14225d3666aff33c4dcbb3726c393f3709ad0a2ca3cb81c50e"} Feb 23 14:41:03.103721 master-0 kubenswrapper[28758]: I0223 14:41:03.098916 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" event={"ID":"28f4aac4-1063-4387-8d70-188a3d572604","Type":"ContainerStarted","Data":"5fcbb5b493b30fa87c0c44d36f94ebda04225647ab5dc4b00839f8f893214cd4"} Feb 23 14:41:04.109595 master-0 kubenswrapper[28758]: I0223 14:41:04.109514 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" event={"ID":"28f4aac4-1063-4387-8d70-188a3d572604","Type":"ContainerStarted","Data":"21ecff25ba8b8607c578a87024abcc9a1c92f46ea8c0d7e6401eb742a02dfeef"} Feb 23 14:41:04.132859 master-0 kubenswrapper[28758]: I0223 14:41:04.132771 28758 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="sushy-emulator/nova-console-poller-85fdff4757-8qt2z" podStartSLOduration=2.5233245 podStartE2EDuration="7.132749927s" podCreationTimestamp="2026-02-23 14:40:57 +0000 UTC" firstStartedPulling="2026-02-23 14:40:58.590062716 +0000 UTC m=+390.716378648" lastFinishedPulling="2026-02-23 14:41:03.199488143 +0000 UTC m=+395.325804075" observedRunningTime="2026-02-23 14:41:04.129183781 +0000 UTC m=+396.255499723" watchObservedRunningTime="2026-02-23 14:41:04.132749927 +0000 UTC m=+396.259065869" Feb 23 14:41:09.294350 master-0 kubenswrapper[28758]: I0223 14:41:09.294215 28758 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Feb 23 14:41:09.294350 master-0 kubenswrapper[28758]: I0223 14:41:09.294361 28758 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: E0223 14:41:09.294823 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-recovery-controller" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.294847 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-recovery-controller" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: E0223 14:41:09.294877 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-cert-syncer" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.294897 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-cert-syncer" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: E0223 14:41:09.294932 28758 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="181adc3f4810f127b44f3750f5d2460c" containerName="cluster-policy-controller" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.294953 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="cluster-policy-controller" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: E0223 14:41:09.294984 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.294998 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: E0223 14:41:09.295016 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295028 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: E0223 14:41:09.295068 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295080 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295330 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="cluster-policy-controller" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295352 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="181adc3f4810f127b44f3750f5d2460c" 
containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295382 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295401 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295424 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-cert-syncer" Feb 23 14:41:09.295542 master-0 kubenswrapper[28758]: I0223 14:41:09.295446 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-recovery-controller" Feb 23 14:41:09.297248 master-0 kubenswrapper[28758]: I0223 14:41:09.297170 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="cluster-policy-controller" containerID="cri-o://0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04" gracePeriod=30 Feb 23 14:41:09.297430 master-0 kubenswrapper[28758]: I0223 14:41:09.297391 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager" containerID="cri-o://001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521" gracePeriod=30 Feb 23 14:41:09.297565 master-0 kubenswrapper[28758]: I0223 14:41:09.297461 28758 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5" gracePeriod=30
Feb 23 14:41:09.297565 master-0 kubenswrapper[28758]: I0223 14:41:09.297557 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="181adc3f4810f127b44f3750f5d2460c" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088" gracePeriod=30
Feb 23 14:41:09.479212 master-0 kubenswrapper[28758]: I0223 14:41:09.479143 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a750a4cc1e38add4dbf13c3fbf3ac793-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a750a4cc1e38add4dbf13c3fbf3ac793\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:09.479351 master-0 kubenswrapper[28758]: I0223 14:41:09.479225 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a750a4cc1e38add4dbf13c3fbf3ac793-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a750a4cc1e38add4dbf13c3fbf3ac793\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:09.569370 master-0 kubenswrapper[28758]: I0223 14:41:09.569213 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager/1.log"
Feb 23 14:41:09.571330 master-0 kubenswrapper[28758]: I0223 14:41:09.571282 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager-cert-syncer/0.log"
Feb 23 14:41:09.572305 master-0 kubenswrapper[28758]: I0223 14:41:09.572223 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:09.576193 master-0 kubenswrapper[28758]: I0223 14:41:09.576133 28758 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="181adc3f4810f127b44f3750f5d2460c" podUID="a750a4cc1e38add4dbf13c3fbf3ac793"
Feb 23 14:41:09.581673 master-0 kubenswrapper[28758]: I0223 14:41:09.581605 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a750a4cc1e38add4dbf13c3fbf3ac793-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a750a4cc1e38add4dbf13c3fbf3ac793\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:09.581748 master-0 kubenswrapper[28758]: I0223 14:41:09.581703 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a750a4cc1e38add4dbf13c3fbf3ac793-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a750a4cc1e38add4dbf13c3fbf3ac793\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:09.581878 master-0 kubenswrapper[28758]: I0223 14:41:09.581812 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a750a4cc1e38add4dbf13c3fbf3ac793-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a750a4cc1e38add4dbf13c3fbf3ac793\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:09.581944 master-0 kubenswrapper[28758]: I0223 14:41:09.581869 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/a750a4cc1e38add4dbf13c3fbf3ac793-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"a750a4cc1e38add4dbf13c3fbf3ac793\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:09.682502 master-0 kubenswrapper[28758]: I0223 14:41:09.682413 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") pod \"181adc3f4810f127b44f3750f5d2460c\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") "
Feb 23 14:41:09.682743 master-0 kubenswrapper[28758]: I0223 14:41:09.682527 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") pod \"181adc3f4810f127b44f3750f5d2460c\" (UID: \"181adc3f4810f127b44f3750f5d2460c\") "
Feb 23 14:41:09.682743 master-0 kubenswrapper[28758]: I0223 14:41:09.682576 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "181adc3f4810f127b44f3750f5d2460c" (UID: "181adc3f4810f127b44f3750f5d2460c"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:41:09.682743 master-0 kubenswrapper[28758]: I0223 14:41:09.682605 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "181adc3f4810f127b44f3750f5d2460c" (UID: "181adc3f4810f127b44f3750f5d2460c"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:41:09.683649 master-0 kubenswrapper[28758]: I0223 14:41:09.683039 28758 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-cert-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:41:09.683649 master-0 kubenswrapper[28758]: I0223 14:41:09.683074 28758 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/181adc3f4810f127b44f3750f5d2460c-resource-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:41:10.099443 master-0 kubenswrapper[28758]: I0223 14:41:10.099360 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181adc3f4810f127b44f3750f5d2460c" path="/var/lib/kubelet/pods/181adc3f4810f127b44f3750f5d2460c/volumes"
Feb 23 14:41:10.163102 master-0 kubenswrapper[28758]: I0223 14:41:10.162977 28758 generic.go:334] "Generic (PLEG): container finished" podID="8416da49-c72c-4f48-a054-bad4fa20e56d" containerID="e1b6fdb79b23dddfb94e4cc28afba8aafa9fdd08a8626a8102e0afa087c61c39" exitCode=0
Feb 23 14:41:10.163102 master-0 kubenswrapper[28758]: I0223 14:41:10.163019 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"8416da49-c72c-4f48-a054-bad4fa20e56d","Type":"ContainerDied","Data":"e1b6fdb79b23dddfb94e4cc28afba8aafa9fdd08a8626a8102e0afa087c61c39"}
Feb 23 14:41:10.165592 master-0 kubenswrapper[28758]: I0223 14:41:10.165546 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager/1.log"
Feb 23 14:41:10.167102 master-0 kubenswrapper[28758]: I0223 14:41:10.167033 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_181adc3f4810f127b44f3750f5d2460c/kube-controller-manager-cert-syncer/0.log"
Feb 23 14:41:10.167606 master-0 kubenswrapper[28758]: I0223 14:41:10.167558 28758 generic.go:334] "Generic (PLEG): container finished" podID="181adc3f4810f127b44f3750f5d2460c" containerID="001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521" exitCode=0
Feb 23 14:41:10.167606 master-0 kubenswrapper[28758]: I0223 14:41:10.167607 28758 generic.go:334] "Generic (PLEG): container finished" podID="181adc3f4810f127b44f3750f5d2460c" containerID="b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5" exitCode=0
Feb 23 14:41:10.167779 master-0 kubenswrapper[28758]: I0223 14:41:10.167618 28758 generic.go:334] "Generic (PLEG): container finished" podID="181adc3f4810f127b44f3750f5d2460c" containerID="0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088" exitCode=2
Feb 23 14:41:10.167779 master-0 kubenswrapper[28758]: I0223 14:41:10.167628 28758 generic.go:334] "Generic (PLEG): container finished" podID="181adc3f4810f127b44f3750f5d2460c" containerID="0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04" exitCode=0
Feb 23 14:41:10.167779 master-0 kubenswrapper[28758]: I0223 14:41:10.167646 28758 scope.go:117] "RemoveContainer" containerID="001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"
Feb 23 14:41:10.167779 master-0 kubenswrapper[28758]: I0223 14:41:10.167620 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:10.186820 master-0 kubenswrapper[28758]: I0223 14:41:10.186705 28758 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="181adc3f4810f127b44f3750f5d2460c" podUID="a750a4cc1e38add4dbf13c3fbf3ac793"
Feb 23 14:41:10.192611 master-0 kubenswrapper[28758]: I0223 14:41:10.192557 28758 scope.go:117] "RemoveContainer" containerID="7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"
Feb 23 14:41:10.213059 master-0 kubenswrapper[28758]: I0223 14:41:10.212974 28758 scope.go:117] "RemoveContainer" containerID="b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"
Feb 23 14:41:10.228624 master-0 kubenswrapper[28758]: I0223 14:41:10.228502 28758 scope.go:117] "RemoveContainer" containerID="0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"
Feb 23 14:41:10.243445 master-0 kubenswrapper[28758]: I0223 14:41:10.243394 28758 scope.go:117] "RemoveContainer" containerID="0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"
Feb 23 14:41:10.266403 master-0 kubenswrapper[28758]: I0223 14:41:10.266343 28758 scope.go:117] "RemoveContainer" containerID="001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"
Feb 23 14:41:10.266782 master-0 kubenswrapper[28758]: E0223 14:41:10.266738 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": container with ID starting with 001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521 not found: ID does not exist" containerID="001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"
Feb 23 14:41:10.266849 master-0 kubenswrapper[28758]: I0223 14:41:10.266786 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"} err="failed to get container status \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": rpc error: code = NotFound desc = could not find container \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": container with ID starting with 001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521 not found: ID does not exist"
Feb 23 14:41:10.266849 master-0 kubenswrapper[28758]: I0223 14:41:10.266813 28758 scope.go:117] "RemoveContainer" containerID="7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"
Feb 23 14:41:10.267223 master-0 kubenswrapper[28758]: E0223 14:41:10.267171 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": container with ID starting with 7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a not found: ID does not exist" containerID="7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"
Feb 23 14:41:10.267285 master-0 kubenswrapper[28758]: I0223 14:41:10.267225 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"} err="failed to get container status \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": rpc error: code = NotFound desc = could not find container \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": container with ID starting with 7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a not found: ID does not exist"
Feb 23 14:41:10.267285 master-0 kubenswrapper[28758]: I0223 14:41:10.267253 28758 scope.go:117] "RemoveContainer" containerID="b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"
Feb 23 14:41:10.267581 master-0 kubenswrapper[28758]: E0223 14:41:10.267542 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": container with ID starting with b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5 not found: ID does not exist" containerID="b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"
Feb 23 14:41:10.267581 master-0 kubenswrapper[28758]: I0223 14:41:10.267570 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"} err="failed to get container status \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": rpc error: code = NotFound desc = could not find container \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": container with ID starting with b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5 not found: ID does not exist"
Feb 23 14:41:10.267787 master-0 kubenswrapper[28758]: I0223 14:41:10.267587 28758 scope.go:117] "RemoveContainer" containerID="0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"
Feb 23 14:41:10.267858 master-0 kubenswrapper[28758]: E0223 14:41:10.267829 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": container with ID starting with 0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088 not found: ID does not exist" containerID="0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"
Feb 23 14:41:10.267909 master-0 kubenswrapper[28758]: I0223 14:41:10.267854 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"} err="failed to get container status \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": rpc error: code = NotFound desc = could not find container \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": container with ID starting with 0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088 not found: ID does not exist"
Feb 23 14:41:10.267909 master-0 kubenswrapper[28758]: I0223 14:41:10.267873 28758 scope.go:117] "RemoveContainer" containerID="0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"
Feb 23 14:41:10.268145 master-0 kubenswrapper[28758]: E0223 14:41:10.268101 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": container with ID starting with 0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04 not found: ID does not exist" containerID="0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"
Feb 23 14:41:10.268145 master-0 kubenswrapper[28758]: I0223 14:41:10.268130 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"} err="failed to get container status \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": rpc error: code = NotFound desc = could not find container \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": container with ID starting with 0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04 not found: ID does not exist"
Feb 23 14:41:10.268145 master-0 kubenswrapper[28758]: I0223 14:41:10.268147 28758 scope.go:117] "RemoveContainer" containerID="001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"
Feb 23 14:41:10.268398 master-0 kubenswrapper[28758]: I0223 14:41:10.268363 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"} err="failed to get container status \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": rpc error: code = NotFound desc = could not find container \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": container with ID starting with 001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521 not found: ID does not exist"
Feb 23 14:41:10.268398 master-0 kubenswrapper[28758]: I0223 14:41:10.268393 28758 scope.go:117] "RemoveContainer" containerID="7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"
Feb 23 14:41:10.268697 master-0 kubenswrapper[28758]: I0223 14:41:10.268664 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"} err="failed to get container status \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": rpc error: code = NotFound desc = could not find container \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": container with ID starting with 7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a not found: ID does not exist"
Feb 23 14:41:10.268697 master-0 kubenswrapper[28758]: I0223 14:41:10.268693 28758 scope.go:117] "RemoveContainer" containerID="b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"
Feb 23 14:41:10.268978 master-0 kubenswrapper[28758]: I0223 14:41:10.268897 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"} err="failed to get container status \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": rpc error: code = NotFound desc = could not find container \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": container with ID starting with b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5 not found: ID does not exist"
Feb 23 14:41:10.268978 master-0 kubenswrapper[28758]: I0223 14:41:10.268925 28758 scope.go:117] "RemoveContainer" containerID="0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"
Feb 23 14:41:10.269254 master-0 kubenswrapper[28758]: I0223 14:41:10.269204 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"} err="failed to get container status \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": rpc error: code = NotFound desc = could not find container \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": container with ID starting with 0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088 not found: ID does not exist"
Feb 23 14:41:10.269254 master-0 kubenswrapper[28758]: I0223 14:41:10.269238 28758 scope.go:117] "RemoveContainer" containerID="0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"
Feb 23 14:41:10.269465 master-0 kubenswrapper[28758]: I0223 14:41:10.269435 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"} err="failed to get container status \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": rpc error: code = NotFound desc = could not find container \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": container with ID starting with 0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04 not found: ID does not exist"
Feb 23 14:41:10.269465 master-0 kubenswrapper[28758]: I0223 14:41:10.269459 28758 scope.go:117] "RemoveContainer" containerID="001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"
Feb 23 14:41:10.269766 master-0 kubenswrapper[28758]: I0223 14:41:10.269695 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"} err="failed to get container status \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": rpc error: code = NotFound desc = could not find container \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": container with ID starting with 001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521 not found: ID does not exist"
Feb 23 14:41:10.269766 master-0 kubenswrapper[28758]: I0223 14:41:10.269722 28758 scope.go:117] "RemoveContainer" containerID="7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"
Feb 23 14:41:10.269965 master-0 kubenswrapper[28758]: I0223 14:41:10.269936 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"} err="failed to get container status \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": rpc error: code = NotFound desc = could not find container \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": container with ID starting with 7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a not found: ID does not exist"
Feb 23 14:41:10.269965 master-0 kubenswrapper[28758]: I0223 14:41:10.269962 28758 scope.go:117] "RemoveContainer" containerID="b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"
Feb 23 14:41:10.270461 master-0 kubenswrapper[28758]: I0223 14:41:10.270414 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"} err="failed to get container status \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": rpc error: code = NotFound desc = could not find container \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": container with ID starting with b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5 not found: ID does not exist"
Feb 23 14:41:10.270461 master-0 kubenswrapper[28758]: I0223 14:41:10.270441 28758 scope.go:117] "RemoveContainer" containerID="0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"
Feb 23 14:41:10.270717 master-0 kubenswrapper[28758]: I0223 14:41:10.270683 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"} err="failed to get container status \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": rpc error: code = NotFound desc = could not find container \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": container with ID starting with 0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088 not found: ID does not exist"
Feb 23 14:41:10.270717 master-0 kubenswrapper[28758]: I0223 14:41:10.270705 28758 scope.go:117] "RemoveContainer" containerID="0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"
Feb 23 14:41:10.270947 master-0 kubenswrapper[28758]: I0223 14:41:10.270904 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"} err="failed to get container status \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": rpc error: code = NotFound desc = could not find container \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": container with ID starting with 0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04 not found: ID does not exist"
Feb 23 14:41:10.270947 master-0 kubenswrapper[28758]: I0223 14:41:10.270933 28758 scope.go:117] "RemoveContainer" containerID="001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"
Feb 23 14:41:10.271372 master-0 kubenswrapper[28758]: I0223 14:41:10.271339 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521"} err="failed to get container status \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": rpc error: code = NotFound desc = could not find container \"001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521\": container with ID starting with 001d001096f1064e07116818eedc5c1059991cb817caf22d97ae69b6af8df521 not found: ID does not exist"
Feb 23 14:41:10.271372 master-0 kubenswrapper[28758]: I0223 14:41:10.271360 28758 scope.go:117] "RemoveContainer" containerID="7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"
Feb 23 14:41:10.271663 master-0 kubenswrapper[28758]: I0223 14:41:10.271626 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a"} err="failed to get container status \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": rpc error: code = NotFound desc = could not find container \"7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a\": container with ID starting with 7b6a30c67bda806ef66a202fea13c367daf0dee629c1c44dffc741cdc340946a not found: ID does not exist"
Feb 23 14:41:10.271663 master-0 kubenswrapper[28758]: I0223 14:41:10.271651 28758 scope.go:117] "RemoveContainer" containerID="b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"
Feb 23 14:41:10.271937 master-0 kubenswrapper[28758]: I0223 14:41:10.271899 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5"} err="failed to get container status \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": rpc error: code = NotFound desc = could not find container \"b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5\": container with ID starting with b8d6f49109bc5e9937c7a4c297e2344d7130b49e86fe7057d8a2caa05af89ff5 not found: ID does not exist"
Feb 23 14:41:10.271937 master-0 kubenswrapper[28758]: I0223 14:41:10.271924 28758 scope.go:117] "RemoveContainer" containerID="0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"
Feb 23 14:41:10.272267 master-0 kubenswrapper[28758]: I0223 14:41:10.272229 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088"} err="failed to get container status \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": rpc error: code = NotFound desc = could not find container \"0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088\": container with ID starting with 0b38f3f0c36dbadf8be89c71d3b96febfcea8812afc5013e2c33ad058c7c6088 not found: ID does not exist"
Feb 23 14:41:10.272267 master-0 kubenswrapper[28758]: I0223 14:41:10.272254 28758 scope.go:117] "RemoveContainer" containerID="0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"
Feb 23 14:41:10.272619 master-0 kubenswrapper[28758]: I0223 14:41:10.272584 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04"} err="failed to get container status \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": rpc error: code = NotFound desc = could not find container \"0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04\": container with ID starting with 0f12986ca20c74365b105ffde80e7b4ab97ae2e79cf0faa03c36002407af9c04 not found: ID does not exist"
Feb 23 14:41:11.509118 master-0 kubenswrapper[28758]: I0223 14:41:11.509046 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Feb 23 14:41:11.611916 master-0 kubenswrapper[28758]: I0223 14:41:11.611807 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-kubelet-dir\") pod \"8416da49-c72c-4f48-a054-bad4fa20e56d\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") "
Feb 23 14:41:11.611916 master-0 kubenswrapper[28758]: I0223 14:41:11.611897 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8416da49-c72c-4f48-a054-bad4fa20e56d-kube-api-access\") pod \"8416da49-c72c-4f48-a054-bad4fa20e56d\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") "
Feb 23 14:41:11.611916 master-0 kubenswrapper[28758]: I0223 14:41:11.611922 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8416da49-c72c-4f48-a054-bad4fa20e56d" (UID: "8416da49-c72c-4f48-a054-bad4fa20e56d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:41:11.612407 master-0 kubenswrapper[28758]: I0223 14:41:11.612086 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-var-lock\") pod \"8416da49-c72c-4f48-a054-bad4fa20e56d\" (UID: \"8416da49-c72c-4f48-a054-bad4fa20e56d\") "
Feb 23 14:41:11.612407 master-0 kubenswrapper[28758]: I0223 14:41:11.612283 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-var-lock" (OuterVolumeSpecName: "var-lock") pod "8416da49-c72c-4f48-a054-bad4fa20e56d" (UID: "8416da49-c72c-4f48-a054-bad4fa20e56d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:41:11.612681 master-0 kubenswrapper[28758]: I0223 14:41:11.612515 28758 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-var-lock\") on node \"master-0\" DevicePath \"\""
Feb 23 14:41:11.612681 master-0 kubenswrapper[28758]: I0223 14:41:11.612537 28758 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8416da49-c72c-4f48-a054-bad4fa20e56d-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Feb 23 14:41:11.614670 master-0 kubenswrapper[28758]: I0223 14:41:11.614597 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8416da49-c72c-4f48-a054-bad4fa20e56d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8416da49-c72c-4f48-a054-bad4fa20e56d" (UID: "8416da49-c72c-4f48-a054-bad4fa20e56d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:41:11.714173 master-0 kubenswrapper[28758]: I0223 14:41:11.714044 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8416da49-c72c-4f48-a054-bad4fa20e56d-kube-api-access\") on node \"master-0\" DevicePath \"\""
Feb 23 14:41:12.189458 master-0 kubenswrapper[28758]: I0223 14:41:12.189381 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"8416da49-c72c-4f48-a054-bad4fa20e56d","Type":"ContainerDied","Data":"49e147d878a7a864b2aad39486e66741da19414611fce969bbefa8ddf0ccf382"}
Feb 23 14:41:12.189458 master-0 kubenswrapper[28758]: I0223 14:41:12.189437 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49e147d878a7a864b2aad39486e66741da19414611fce969bbefa8ddf0ccf382"
Feb 23 14:41:12.189458 master-0 kubenswrapper[28758]: I0223 14:41:12.189438 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Feb 23 14:41:24.087933 master-0 kubenswrapper[28758]: I0223 14:41:24.087816 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:24.110558 master-0 kubenswrapper[28758]: I0223 14:41:24.110443 28758 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b4b08e3b-d68f-4e07-8071-29d4d324e614"
Feb 23 14:41:24.110558 master-0 kubenswrapper[28758]: I0223 14:41:24.110511 28758 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="b4b08e3b-d68f-4e07-8071-29d4d324e614"
Feb 23 14:41:24.127450 master-0 kubenswrapper[28758]: I0223 14:41:24.126151 28758 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:24.136545 master-0 kubenswrapper[28758]: I0223 14:41:24.136428 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Feb 23 14:41:24.142427 master-0 kubenswrapper[28758]: I0223 14:41:24.142265 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Feb 23 14:41:24.143999 master-0 kubenswrapper[28758]: I0223 14:41:24.143953 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:24.150609 master-0 kubenswrapper[28758]: I0223 14:41:24.150564 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"]
Feb 23 14:41:24.168526 master-0 kubenswrapper[28758]: W0223 14:41:24.168442 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda750a4cc1e38add4dbf13c3fbf3ac793.slice/crio-a090eb0b5079dd5d083c8f2e16c4001566810f048d1fdb3978e368dc6641e8f3 WatchSource:0}: Error finding container a090eb0b5079dd5d083c8f2e16c4001566810f048d1fdb3978e368dc6641e8f3: Status 404 returned error can't find the container with id a090eb0b5079dd5d083c8f2e16c4001566810f048d1fdb3978e368dc6641e8f3
Feb 23 14:41:24.297938 master-0 kubenswrapper[28758]: I0223 14:41:24.297859 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a750a4cc1e38add4dbf13c3fbf3ac793","Type":"ContainerStarted","Data":"a090eb0b5079dd5d083c8f2e16c4001566810f048d1fdb3978e368dc6641e8f3"}
Feb 23 14:41:25.306236 master-0 kubenswrapper[28758]: I0223 14:41:25.306181 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a750a4cc1e38add4dbf13c3fbf3ac793","Type":"ContainerStarted","Data":"124ba6182bf09f152f4f8d4f9c73ab0727dea5df862c96015425cf730d347ad4"}
Feb 23 14:41:25.306236 master-0 kubenswrapper[28758]: I0223 14:41:25.306236 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a750a4cc1e38add4dbf13c3fbf3ac793","Type":"ContainerStarted","Data":"cbb8fa74241683441e2555ce8310af3eb2b0d6861a5d57510814beb2369a394b"}
Feb 23 14:41:25.306236 master-0 kubenswrapper[28758]: I0223 14:41:25.306251 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a750a4cc1e38add4dbf13c3fbf3ac793","Type":"ContainerStarted","Data":"37dbacc303a98b7087075cee00b9692de188516367a3dc708728182747bbe1d8"}
Feb 23 14:41:25.306996 master-0 kubenswrapper[28758]: I0223 14:41:25.306263 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"a750a4cc1e38add4dbf13c3fbf3ac793","Type":"ContainerStarted","Data":"8f3295f287ff0e5dc8353f4d6c6238a82c80790968e513bf331094cb75b2f5f6"}
Feb 23 14:41:25.325733 master-0 kubenswrapper[28758]: I0223 14:41:25.325643 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.325627277 podStartE2EDuration="1.325627277s" podCreationTimestamp="2026-02-23 14:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:41:25.324112116 +0000 UTC m=+417.450428048" watchObservedRunningTime="2026-02-23 14:41:25.325627277 +0000 UTC m=+417.451943209"
Feb 23 14:41:34.146310 master-0 kubenswrapper[28758]: I0223 14:41:34.146167 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:34.146310 master-0 kubenswrapper[28758]: I0223 14:41:34.146251 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:34.146310 master-0 kubenswrapper[28758]: I0223 14:41:34.146274 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:34.146310 master-0 kubenswrapper[28758]: I0223 14:41:34.146294 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:34.147090 master-0 kubenswrapper[28758]: I0223 14:41:34.146989 28758 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Feb 23 14:41:34.147126 master-0 kubenswrapper[28758]: I0223 14:41:34.147078 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="a750a4cc1e38add4dbf13c3fbf3ac793" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Feb 23 14:41:34.152338 master-0 kubenswrapper[28758]: I0223 14:41:34.152269 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:34.386522 master-0 kubenswrapper[28758]: I0223 14:41:34.386350 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:44.152357 master-0 kubenswrapper[28758]: I0223 14:41:44.152274 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:41:44.156966 master-0 kubenswrapper[28758]: I0223 14:41:44.156912 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Feb 23 14:42:04.152726 master-0 kubenswrapper[28758]: I0223 14:42:04.152672 28758 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44"] Feb 23 14:42:04.153783 master-0 kubenswrapper[28758]: E0223 14:42:04.153761 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8416da49-c72c-4f48-a054-bad4fa20e56d" containerName="installer" Feb 23 14:42:04.153878 master-0 kubenswrapper[28758]: I0223 14:42:04.153867 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="8416da49-c72c-4f48-a054-bad4fa20e56d" containerName="installer" Feb 23 14:42:04.154095 master-0 kubenswrapper[28758]: I0223 14:42:04.154083 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="8416da49-c72c-4f48-a054-bad4fa20e56d" containerName="installer" Feb 23 14:42:04.154969 master-0 kubenswrapper[28758]: I0223 14:42:04.154947 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.166998 master-0 kubenswrapper[28758]: I0223 14:42:04.166417 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/eb4ba18b-1dd1-4778-a229-44291446e41b-os-client-config\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.268894 master-0 kubenswrapper[28758]: I0223 14:42:04.268857 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44"] Feb 23 14:42:04.269134 master-0 kubenswrapper[28758]: I0223 14:42:04.268988 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/eb4ba18b-1dd1-4778-a229-44291446e41b-os-client-config\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.277256 master-0 
kubenswrapper[28758]: I0223 14:42:04.275449 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/eb4ba18b-1dd1-4778-a229-44291446e41b-os-client-config\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.471748 master-0 kubenswrapper[28758]: I0223 14:42:04.471664 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wprh\" (UniqueName: \"kubernetes.io/projected/eb4ba18b-1dd1-4778-a229-44291446e41b-kube-api-access-9wprh\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.472013 master-0 kubenswrapper[28758]: I0223 14:42:04.471823 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/eb4ba18b-1dd1-4778-a229-44291446e41b-nova-console-recordings-pv\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.574301 master-0 kubenswrapper[28758]: I0223 14:42:04.574227 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wprh\" (UniqueName: \"kubernetes.io/projected/eb4ba18b-1dd1-4778-a229-44291446e41b-kube-api-access-9wprh\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.574881 master-0 kubenswrapper[28758]: I0223 14:42:04.574829 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: 
\"kubernetes.io/nfs/eb4ba18b-1dd1-4778-a229-44291446e41b-nova-console-recordings-pv\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:04.601864 master-0 kubenswrapper[28758]: I0223 14:42:04.601795 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wprh\" (UniqueName: \"kubernetes.io/projected/eb4ba18b-1dd1-4778-a229-44291446e41b-kube-api-access-9wprh\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:05.680106 master-0 kubenswrapper[28758]: I0223 14:42:05.679965 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/eb4ba18b-1dd1-4778-a229-44291446e41b-nova-console-recordings-pv\") pod \"nova-console-recorder-9f9c7bbd9-q5c44\" (UID: \"eb4ba18b-1dd1-4778-a229-44291446e41b\") " pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:05.971226 master-0 kubenswrapper[28758]: I0223 14:42:05.971146 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" Feb 23 14:42:06.376175 master-0 kubenswrapper[28758]: I0223 14:42:06.376118 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44"] Feb 23 14:42:06.646362 master-0 kubenswrapper[28758]: I0223 14:42:06.646133 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" event={"ID":"eb4ba18b-1dd1-4778-a229-44291446e41b","Type":"ContainerStarted","Data":"70e2eaa60b2cc6e5e8c92a4e0dc770556482d2c660d74edefcb55dc4c69891f9"} Feb 23 14:42:13.698397 master-0 kubenswrapper[28758]: I0223 14:42:13.698344 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" event={"ID":"eb4ba18b-1dd1-4778-a229-44291446e41b","Type":"ContainerStarted","Data":"90f13448ed653b780e557664334f2fa2cf6d89efb4afd479fe247f99bd7ed2be"} Feb 23 14:42:15.740702 master-0 kubenswrapper[28758]: I0223 14:42:15.740152 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" event={"ID":"eb4ba18b-1dd1-4778-a229-44291446e41b","Type":"ContainerStarted","Data":"0905ce56357199db8b0a09eb93eeb2ff03aeb56a7bbc253ece18e78ae24b3b47"} Feb 23 14:42:15.765421 master-0 kubenswrapper[28758]: I0223 14:42:15.765301 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-recorder-9f9c7bbd9-q5c44" podStartSLOduration=3.10321639 podStartE2EDuration="11.765278829s" podCreationTimestamp="2026-02-23 14:42:04 +0000 UTC" firstStartedPulling="2026-02-23 14:42:06.383346774 +0000 UTC m=+458.509662696" lastFinishedPulling="2026-02-23 14:42:15.045409183 +0000 UTC m=+467.171725135" observedRunningTime="2026-02-23 14:42:15.759792462 +0000 UTC m=+467.886108394" watchObservedRunningTime="2026-02-23 14:42:15.765278829 +0000 UTC m=+467.891594761" Feb 23 14:42:40.306672 master-0 
kubenswrapper[28758]: I0223 14:42:40.306618 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk"] Feb 23 14:42:40.308897 master-0 kubenswrapper[28758]: I0223 14:42:40.308863 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.311421 master-0 kubenswrapper[28758]: I0223 14:42:40.311394 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-49vsk" Feb 23 14:42:40.316069 master-0 kubenswrapper[28758]: I0223 14:42:40.315985 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk"] Feb 23 14:42:40.470281 master-0 kubenswrapper[28758]: I0223 14:42:40.470236 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.470607 master-0 kubenswrapper[28758]: I0223 14:42:40.470583 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt5fr\" (UniqueName: \"kubernetes.io/projected/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-kube-api-access-jt5fr\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.470728 master-0 kubenswrapper[28758]: I0223 14:42:40.470711 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.572275 master-0 kubenswrapper[28758]: I0223 14:42:40.572042 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt5fr\" (UniqueName: \"kubernetes.io/projected/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-kube-api-access-jt5fr\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.572878 master-0 kubenswrapper[28758]: I0223 14:42:40.572810 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.573380 master-0 kubenswrapper[28758]: I0223 14:42:40.573337 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.573889 master-0 kubenswrapper[28758]: I0223 14:42:40.573420 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.574183 master-0 kubenswrapper[28758]: I0223 14:42:40.574115 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.604672 master-0 kubenswrapper[28758]: I0223 14:42:40.604567 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt5fr\" (UniqueName: \"kubernetes.io/projected/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-kube-api-access-jt5fr\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:40.635048 master-0 kubenswrapper[28758]: I0223 14:42:40.634977 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:41.087108 master-0 kubenswrapper[28758]: I0223 14:42:41.087025 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk"] Feb 23 14:42:41.099271 master-0 kubenswrapper[28758]: W0223 14:42:41.099186 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3dfd1fe_c87f_4936_8e0b_6372b8f492d5.slice/crio-d578d95b8f892ad1c17d651d20ae6ad756da2338d4130d148ef157ae52601c0c WatchSource:0}: Error finding container d578d95b8f892ad1c17d651d20ae6ad756da2338d4130d148ef157ae52601c0c: Status 404 returned error can't find the container with id d578d95b8f892ad1c17d651d20ae6ad756da2338d4130d148ef157ae52601c0c Feb 23 14:42:41.968740 master-0 kubenswrapper[28758]: I0223 14:42:41.967446 28758 generic.go:334] "Generic (PLEG): container finished" podID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerID="f1c9d2e4643c231f491954ebabbce37e94e97d1c7cb05989e509b5855b1d45b9" exitCode=0 Feb 23 14:42:41.968740 master-0 kubenswrapper[28758]: I0223 14:42:41.967565 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" event={"ID":"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5","Type":"ContainerDied","Data":"f1c9d2e4643c231f491954ebabbce37e94e97d1c7cb05989e509b5855b1d45b9"} Feb 23 14:42:41.968740 master-0 kubenswrapper[28758]: I0223 14:42:41.967631 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" event={"ID":"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5","Type":"ContainerStarted","Data":"d578d95b8f892ad1c17d651d20ae6ad756da2338d4130d148ef157ae52601c0c"} Feb 23 14:42:44.992745 master-0 kubenswrapper[28758]: I0223 14:42:44.992668 28758 
generic.go:334] "Generic (PLEG): container finished" podID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerID="779c75176cecd1df00ffb0ddab190bb998ea5f71fc1f77736121105d7c8df1e4" exitCode=0 Feb 23 14:42:44.992745 master-0 kubenswrapper[28758]: I0223 14:42:44.992740 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" event={"ID":"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5","Type":"ContainerDied","Data":"779c75176cecd1df00ffb0ddab190bb998ea5f71fc1f77736121105d7c8df1e4"} Feb 23 14:42:46.004202 master-0 kubenswrapper[28758]: I0223 14:42:46.004141 28758 generic.go:334] "Generic (PLEG): container finished" podID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerID="cb75c5d3bd52e61c0860f77f8bf70b814e395dd15beb356eb85e1587a8eeaccd" exitCode=0 Feb 23 14:42:46.004202 master-0 kubenswrapper[28758]: I0223 14:42:46.004200 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" event={"ID":"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5","Type":"ContainerDied","Data":"cb75c5d3bd52e61c0860f77f8bf70b814e395dd15beb356eb85e1587a8eeaccd"} Feb 23 14:42:47.320509 master-0 kubenswrapper[28758]: I0223 14:42:47.320374 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:47.492317 master-0 kubenswrapper[28758]: I0223 14:42:47.492205 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt5fr\" (UniqueName: \"kubernetes.io/projected/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-kube-api-access-jt5fr\") pod \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " Feb 23 14:42:47.492726 master-0 kubenswrapper[28758]: I0223 14:42:47.492437 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-bundle\") pod \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " Feb 23 14:42:47.492726 master-0 kubenswrapper[28758]: I0223 14:42:47.492526 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-util\") pod \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\" (UID: \"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5\") " Feb 23 14:42:47.493916 master-0 kubenswrapper[28758]: I0223 14:42:47.493837 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-bundle" (OuterVolumeSpecName: "bundle") pod "c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" (UID: "c3dfd1fe-c87f-4936-8e0b-6372b8f492d5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:42:47.495312 master-0 kubenswrapper[28758]: I0223 14:42:47.495249 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-kube-api-access-jt5fr" (OuterVolumeSpecName: "kube-api-access-jt5fr") pod "c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" (UID: "c3dfd1fe-c87f-4936-8e0b-6372b8f492d5"). InnerVolumeSpecName "kube-api-access-jt5fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:42:47.509022 master-0 kubenswrapper[28758]: I0223 14:42:47.508949 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-util" (OuterVolumeSpecName: "util") pod "c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" (UID: "c3dfd1fe-c87f-4936-8e0b-6372b8f492d5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:42:47.594611 master-0 kubenswrapper[28758]: I0223 14:42:47.594511 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt5fr\" (UniqueName: \"kubernetes.io/projected/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-kube-api-access-jt5fr\") on node \"master-0\" DevicePath \"\"" Feb 23 14:42:47.594611 master-0 kubenswrapper[28758]: I0223 14:42:47.594569 28758 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:42:47.594611 master-0 kubenswrapper[28758]: I0223 14:42:47.594582 28758 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c3dfd1fe-c87f-4936-8e0b-6372b8f492d5-util\") on node \"master-0\" DevicePath \"\"" Feb 23 14:42:48.023657 master-0 kubenswrapper[28758]: I0223 14:42:48.023558 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" event={"ID":"c3dfd1fe-c87f-4936-8e0b-6372b8f492d5","Type":"ContainerDied","Data":"d578d95b8f892ad1c17d651d20ae6ad756da2338d4130d148ef157ae52601c0c"} Feb 23 14:42:48.023657 master-0 kubenswrapper[28758]: I0223 14:42:48.023647 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d578d95b8f892ad1c17d651d20ae6ad756da2338d4130d148ef157ae52601c0c" Feb 23 14:42:48.024004 master-0 kubenswrapper[28758]: I0223 14:42:48.023683 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dxvsk" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: I0223 14:42:53.752099 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-94bf95b68-5255q"] Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: E0223 14:42:53.752509 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerName="extract" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: I0223 14:42:53.752526 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerName="extract" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: E0223 14:42:53.752577 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerName="util" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: I0223 14:42:53.752586 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerName="util" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: E0223 14:42:53.752597 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerName="pull" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: I0223 
14:42:53.752605 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerName="pull" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: I0223 14:42:53.752760 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3dfd1fe-c87f-4936-8e0b-6372b8f492d5" containerName="extract" Feb 23 14:42:53.753645 master-0 kubenswrapper[28758]: I0223 14:42:53.753625 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-94bf95b68-5255q" Feb 23 14:42:53.760414 master-0 kubenswrapper[28758]: I0223 14:42:53.760352 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Feb 23 14:42:53.760523 master-0 kubenswrapper[28758]: I0223 14:42:53.760410 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Feb 23 14:42:53.772286 master-0 kubenswrapper[28758]: I0223 14:42:53.763842 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Feb 23 14:42:53.772286 master-0 kubenswrapper[28758]: I0223 14:42:53.764107 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Feb 23 14:42:53.772286 master-0 kubenswrapper[28758]: I0223 14:42:53.764272 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Feb 23 14:42:53.790860 master-0 kubenswrapper[28758]: I0223 14:42:53.790770 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-94bf95b68-5255q"] Feb 23 14:42:53.848623 master-0 kubenswrapper[28758]: I0223 14:42:53.848557 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-metrics-cert\") pod 
\"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q" Feb 23 14:42:53.848623 master-0 kubenswrapper[28758]: I0223 14:42:53.848617 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-apiservice-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q" Feb 23 14:42:53.848919 master-0 kubenswrapper[28758]: I0223 14:42:53.848713 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg792\" (UniqueName: \"kubernetes.io/projected/49b7c98e-74d2-473c-ba94-7f8b65d52e22-kube-api-access-gg792\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q" Feb 23 14:42:53.848919 master-0 kubenswrapper[28758]: I0223 14:42:53.848777 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/49b7c98e-74d2-473c-ba94-7f8b65d52e22-socket-dir\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q" Feb 23 14:42:53.848919 master-0 kubenswrapper[28758]: I0223 14:42:53.848796 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-webhook-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q" Feb 23 14:42:53.949838 master-0 kubenswrapper[28758]: I0223 14:42:53.949761 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/49b7c98e-74d2-473c-ba94-7f8b65d52e22-socket-dir\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.949838 master-0 kubenswrapper[28758]: I0223 14:42:53.949812 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-webhook-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.949838 master-0 kubenswrapper[28758]: I0223 14:42:53.949835 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-metrics-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.950209 master-0 kubenswrapper[28758]: I0223 14:42:53.950007 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-apiservice-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.950267 master-0 kubenswrapper[28758]: I0223 14:42:53.950254 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg792\" (UniqueName: \"kubernetes.io/projected/49b7c98e-74d2-473c-ba94-7f8b65d52e22-kube-api-access-gg792\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.950399 master-0 kubenswrapper[28758]: I0223 14:42:53.950357 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/49b7c98e-74d2-473c-ba94-7f8b65d52e22-socket-dir\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.956068 master-0 kubenswrapper[28758]: I0223 14:42:53.955987 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-metrics-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.956270 master-0 kubenswrapper[28758]: I0223 14:42:53.956249 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-webhook-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.957468 master-0 kubenswrapper[28758]: I0223 14:42:53.957425 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/49b7c98e-74d2-473c-ba94-7f8b65d52e22-apiservice-cert\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:53.980661 master-0 kubenswrapper[28758]: I0223 14:42:53.979553 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg792\" (UniqueName: \"kubernetes.io/projected/49b7c98e-74d2-473c-ba94-7f8b65d52e22-kube-api-access-gg792\") pod \"lvms-operator-94bf95b68-5255q\" (UID: \"49b7c98e-74d2-473c-ba94-7f8b65d52e22\") " pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:54.089642 master-0 kubenswrapper[28758]: I0223 14:42:54.089514 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:42:54.506263 master-0 kubenswrapper[28758]: I0223 14:42:54.506210 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-94bf95b68-5255q"]
Feb 23 14:42:54.507383 master-0 kubenswrapper[28758]: W0223 14:42:54.507326 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49b7c98e_74d2_473c_ba94_7f8b65d52e22.slice/crio-e546a35854358087be62c221d3a60c3de807b00aa7a7b6293650d434f338de75 WatchSource:0}: Error finding container e546a35854358087be62c221d3a60c3de807b00aa7a7b6293650d434f338de75: Status 404 returned error can't find the container with id e546a35854358087be62c221d3a60c3de807b00aa7a7b6293650d434f338de75
Feb 23 14:42:55.080773 master-0 kubenswrapper[28758]: I0223 14:42:55.080685 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-94bf95b68-5255q" event={"ID":"49b7c98e-74d2-473c-ba94-7f8b65d52e22","Type":"ContainerStarted","Data":"e546a35854358087be62c221d3a60c3de807b00aa7a7b6293650d434f338de75"}
Feb 23 14:43:00.122705 master-0 kubenswrapper[28758]: I0223 14:43:00.122561 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-94bf95b68-5255q" event={"ID":"49b7c98e-74d2-473c-ba94-7f8b65d52e22","Type":"ContainerStarted","Data":"ade142081b88a48e2c5ce44e78c17f75d446febe335209ea284f503c4a36cdc9"}
Feb 23 14:43:00.123310 master-0 kubenswrapper[28758]: I0223 14:43:00.123277 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:43:00.127199 master-0 kubenswrapper[28758]: I0223 14:43:00.127157 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-94bf95b68-5255q"
Feb 23 14:43:00.144541 master-0 kubenswrapper[28758]: I0223 14:43:00.144432 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-94bf95b68-5255q" podStartSLOduration=2.334187917 podStartE2EDuration="7.144410464s" podCreationTimestamp="2026-02-23 14:42:53 +0000 UTC" firstStartedPulling="2026-02-23 14:42:54.509580921 +0000 UTC m=+506.635896853" lastFinishedPulling="2026-02-23 14:42:59.319803468 +0000 UTC m=+511.446119400" observedRunningTime="2026-02-23 14:43:00.140798478 +0000 UTC m=+512.267114410" watchObservedRunningTime="2026-02-23 14:43:00.144410464 +0000 UTC m=+512.270726396"
Feb 23 14:43:04.365745 master-0 kubenswrapper[28758]: I0223 14:43:04.365694 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"]
Feb 23 14:43:04.367693 master-0 kubenswrapper[28758]: I0223 14:43:04.367668 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.370042 master-0 kubenswrapper[28758]: I0223 14:43:04.369978 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-49vsk"
Feb 23 14:43:04.379961 master-0 kubenswrapper[28758]: I0223 14:43:04.379790 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"]
Feb 23 14:43:04.511391 master-0 kubenswrapper[28758]: I0223 14:43:04.511341 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.511869 master-0 kubenswrapper[28758]: I0223 14:43:04.511840 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756hx\" (UniqueName: \"kubernetes.io/projected/3fcade64-f544-4158-9a39-31cace8542dc-kube-api-access-756hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.512109 master-0 kubenswrapper[28758]: I0223 14:43:04.512079 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.612673 master-0 kubenswrapper[28758]: I0223 14:43:04.612607 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.612673 master-0 kubenswrapper[28758]: I0223 14:43:04.612680 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-756hx\" (UniqueName: \"kubernetes.io/projected/3fcade64-f544-4158-9a39-31cace8542dc-kube-api-access-756hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.613093 master-0 kubenswrapper[28758]: I0223 14:43:04.612723 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.613175 master-0 kubenswrapper[28758]: I0223 14:43:04.613126 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.613175 master-0 kubenswrapper[28758]: I0223 14:43:04.613169 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.629533 master-0 kubenswrapper[28758]: I0223 14:43:04.629413 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-756hx\" (UniqueName: \"kubernetes.io/projected/3fcade64-f544-4158-9a39-31cace8542dc-kube-api-access-756hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:04.683990 master-0 kubenswrapper[28758]: I0223 14:43:04.683940 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:05.085910 master-0 kubenswrapper[28758]: W0223 14:43:05.084930 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fcade64_f544_4158_9a39_31cace8542dc.slice/crio-bce3a0437b8e8e3f9c2f432da9c1e325f7d1e21c2a533df16e7f32bbeb45a766 WatchSource:0}: Error finding container bce3a0437b8e8e3f9c2f432da9c1e325f7d1e21c2a533df16e7f32bbeb45a766: Status 404 returned error can't find the container with id bce3a0437b8e8e3f9c2f432da9c1e325f7d1e21c2a533df16e7f32bbeb45a766
Feb 23 14:43:05.088072 master-0 kubenswrapper[28758]: I0223 14:43:05.088022 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"]
Feb 23 14:43:05.158245 master-0 kubenswrapper[28758]: I0223 14:43:05.158136 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5" event={"ID":"3fcade64-f544-4158-9a39-31cace8542dc","Type":"ContainerStarted","Data":"bce3a0437b8e8e3f9c2f432da9c1e325f7d1e21c2a533df16e7f32bbeb45a766"}
Feb 23 14:43:05.392026 master-0 kubenswrapper[28758]: I0223 14:43:05.391886 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"]
Feb 23 14:43:05.393713 master-0 kubenswrapper[28758]: I0223 14:43:05.393676 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.406157 master-0 kubenswrapper[28758]: I0223 14:43:05.406093 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"]
Feb 23 14:43:05.526026 master-0 kubenswrapper[28758]: I0223 14:43:05.525962 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.526258 master-0 kubenswrapper[28758]: I0223 14:43:05.526098 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4vf\" (UniqueName: \"kubernetes.io/projected/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-kube-api-access-qb4vf\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.526351 master-0 kubenswrapper[28758]: I0223 14:43:05.526321 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.627585 master-0 kubenswrapper[28758]: I0223 14:43:05.627510 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.627585 master-0 kubenswrapper[28758]: I0223 14:43:05.627567 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4vf\" (UniqueName: \"kubernetes.io/projected/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-kube-api-access-qb4vf\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.627890 master-0 kubenswrapper[28758]: I0223 14:43:05.627653 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.628159 master-0 kubenswrapper[28758]: I0223 14:43:05.628116 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.628269 master-0 kubenswrapper[28758]: I0223 14:43:05.628221 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.643730 master-0 kubenswrapper[28758]: I0223 14:43:05.643628 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4vf\" (UniqueName: \"kubernetes.io/projected/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-kube-api-access-qb4vf\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:05.995135 master-0 kubenswrapper[28758]: I0223 14:43:05.994852 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:06.161327 master-0 kubenswrapper[28758]: I0223 14:43:06.161258 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"]
Feb 23 14:43:06.163966 master-0 kubenswrapper[28758]: I0223 14:43:06.162892 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.167452 master-0 kubenswrapper[28758]: I0223 14:43:06.167403 28758 generic.go:334] "Generic (PLEG): container finished" podID="3fcade64-f544-4158-9a39-31cace8542dc" containerID="7dc604e199718de6dd12d7a251110c49bd3492c334ea8a45711e45da6bb96833" exitCode=0
Feb 23 14:43:06.167452 master-0 kubenswrapper[28758]: I0223 14:43:06.167439 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5" event={"ID":"3fcade64-f544-4158-9a39-31cace8542dc","Type":"ContainerDied","Data":"7dc604e199718de6dd12d7a251110c49bd3492c334ea8a45711e45da6bb96833"}
Feb 23 14:43:06.167709 master-0 kubenswrapper[28758]: I0223 14:43:06.167644 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"]
Feb 23 14:43:06.197716 master-0 kubenswrapper[28758]: I0223 14:43:06.197657 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh7pp\" (UniqueName: \"kubernetes.io/projected/66d96f69-3a24-4996-839e-05fc58aa8fb5-kube-api-access-jh7pp\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.197938 master-0 kubenswrapper[28758]: I0223 14:43:06.197805 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.197938 master-0 kubenswrapper[28758]: I0223 14:43:06.197898 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.299470 master-0 kubenswrapper[28758]: I0223 14:43:06.299307 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.299470 master-0 kubenswrapper[28758]: I0223 14:43:06.299412 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.299741 master-0 kubenswrapper[28758]: I0223 14:43:06.299503 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh7pp\" (UniqueName: \"kubernetes.io/projected/66d96f69-3a24-4996-839e-05fc58aa8fb5-kube-api-access-jh7pp\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.299937 master-0 kubenswrapper[28758]: I0223 14:43:06.299888 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.300356 master-0 kubenswrapper[28758]: I0223 14:43:06.300306 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.315825 master-0 kubenswrapper[28758]: I0223 14:43:06.315760 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh7pp\" (UniqueName: \"kubernetes.io/projected/66d96f69-3a24-4996-839e-05fc58aa8fb5-kube-api-access-jh7pp\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.404994 master-0 kubenswrapper[28758]: I0223 14:43:06.404918 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"]
Feb 23 14:43:06.418759 master-0 kubenswrapper[28758]: W0223 14:43:06.418675 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4e41f54_d479_4cbc_8ed9_984aae83ad0d.slice/crio-3b32c4e5c105ce63995977ccd27c168f158710fe49d26e8a795de19b0ecf8182 WatchSource:0}: Error finding container 3b32c4e5c105ce63995977ccd27c168f158710fe49d26e8a795de19b0ecf8182: Status 404 returned error can't find the container with id 3b32c4e5c105ce63995977ccd27c168f158710fe49d26e8a795de19b0ecf8182
Feb 23 14:43:06.488566 master-0 kubenswrapper[28758]: I0223 14:43:06.484980 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:06.897642 master-0 kubenswrapper[28758]: I0223 14:43:06.897555 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"]
Feb 23 14:43:06.900976 master-0 kubenswrapper[28758]: W0223 14:43:06.900811 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66d96f69_3a24_4996_839e_05fc58aa8fb5.slice/crio-8011cd6ff20f578826a6de51ba0df19bde87bd9b6392edee8df7da72167352d6 WatchSource:0}: Error finding container 8011cd6ff20f578826a6de51ba0df19bde87bd9b6392edee8df7da72167352d6: Status 404 returned error can't find the container with id 8011cd6ff20f578826a6de51ba0df19bde87bd9b6392edee8df7da72167352d6
Feb 23 14:43:07.178093 master-0 kubenswrapper[28758]: I0223 14:43:07.177937 28758 generic.go:334] "Generic (PLEG): container finished" podID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerID="7302d0dc820b6d660faecaa48533a421d515f5502d5ffc17664736e8bfd2d347" exitCode=0
Feb 23 14:43:07.178093 master-0 kubenswrapper[28758]: I0223 14:43:07.178002 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz" event={"ID":"66d96f69-3a24-4996-839e-05fc58aa8fb5","Type":"ContainerDied","Data":"7302d0dc820b6d660faecaa48533a421d515f5502d5ffc17664736e8bfd2d347"}
Feb 23 14:43:07.178093 master-0 kubenswrapper[28758]: I0223 14:43:07.178030 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz" event={"ID":"66d96f69-3a24-4996-839e-05fc58aa8fb5","Type":"ContainerStarted","Data":"8011cd6ff20f578826a6de51ba0df19bde87bd9b6392edee8df7da72167352d6"}
Feb 23 14:43:07.180784 master-0 kubenswrapper[28758]: I0223 14:43:07.180560 28758 generic.go:334] "Generic (PLEG): container finished" podID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerID="432078001b2afe1e26e14bd764d40eb6859b8827edc36f9eeaa724db25da5516" exitCode=0
Feb 23 14:43:07.180784 master-0 kubenswrapper[28758]: I0223 14:43:07.180618 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm" event={"ID":"c4e41f54-d479-4cbc-8ed9-984aae83ad0d","Type":"ContainerDied","Data":"432078001b2afe1e26e14bd764d40eb6859b8827edc36f9eeaa724db25da5516"}
Feb 23 14:43:07.180784 master-0 kubenswrapper[28758]: I0223 14:43:07.180652 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm" event={"ID":"c4e41f54-d479-4cbc-8ed9-984aae83ad0d","Type":"ContainerStarted","Data":"3b32c4e5c105ce63995977ccd27c168f158710fe49d26e8a795de19b0ecf8182"}
Feb 23 14:43:10.209833 master-0 kubenswrapper[28758]: I0223 14:43:10.209280 28758 generic.go:334] "Generic (PLEG): container finished" podID="3fcade64-f544-4158-9a39-31cace8542dc" containerID="08f9a3119e211f3afe00b7e79642980c1fa19bd385fac8e05eabee832d7506b4" exitCode=0
Feb 23 14:43:10.211073 master-0 kubenswrapper[28758]: I0223 14:43:10.209446 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5" event={"ID":"3fcade64-f544-4158-9a39-31cace8542dc","Type":"ContainerDied","Data":"08f9a3119e211f3afe00b7e79642980c1fa19bd385fac8e05eabee832d7506b4"}
Feb 23 14:43:10.214143 master-0 kubenswrapper[28758]: I0223 14:43:10.214092 28758 generic.go:334] "Generic (PLEG): container finished" podID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerID="028739456e18f8408dd9fd6da3da56b0c6839f510528c77f623355bbcf1800be" exitCode=0
Feb 23 14:43:10.214256 master-0 kubenswrapper[28758]: I0223 14:43:10.214206 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm" event={"ID":"c4e41f54-d479-4cbc-8ed9-984aae83ad0d","Type":"ContainerDied","Data":"028739456e18f8408dd9fd6da3da56b0c6839f510528c77f623355bbcf1800be"}
Feb 23 14:43:10.217830 master-0 kubenswrapper[28758]: I0223 14:43:10.217780 28758 generic.go:334] "Generic (PLEG): container finished" podID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerID="35f4cfbba29e503bbb50ab9e87a550bce0021b27075c6416411c7d67aa38738c" exitCode=0
Feb 23 14:43:10.217830 master-0 kubenswrapper[28758]: I0223 14:43:10.217816 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz" event={"ID":"66d96f69-3a24-4996-839e-05fc58aa8fb5","Type":"ContainerDied","Data":"35f4cfbba29e503bbb50ab9e87a550bce0021b27075c6416411c7d67aa38738c"}
Feb 23 14:43:11.228565 master-0 kubenswrapper[28758]: I0223 14:43:11.228458 28758 generic.go:334] "Generic (PLEG): container finished" podID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerID="414cc5cee67fab4cd3292599e1c7cb44926d45ce70184a6797aed97ec799dd25" exitCode=0
Feb 23 14:43:11.229095 master-0 kubenswrapper[28758]: I0223 14:43:11.228595 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz" event={"ID":"66d96f69-3a24-4996-839e-05fc58aa8fb5","Type":"ContainerDied","Data":"414cc5cee67fab4cd3292599e1c7cb44926d45ce70184a6797aed97ec799dd25"}
Feb 23 14:43:11.232311 master-0 kubenswrapper[28758]: I0223 14:43:11.232242 28758 generic.go:334] "Generic (PLEG): container finished" podID="3fcade64-f544-4158-9a39-31cace8542dc" containerID="7f65775e00ac0a94f596500254fa169aa8d37e38e78bf271c231880ea9da9f5e" exitCode=0
Feb 23 14:43:11.232421 master-0 kubenswrapper[28758]: I0223 14:43:11.232329 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5" event={"ID":"3fcade64-f544-4158-9a39-31cace8542dc","Type":"ContainerDied","Data":"7f65775e00ac0a94f596500254fa169aa8d37e38e78bf271c231880ea9da9f5e"}
Feb 23 14:43:11.235144 master-0 kubenswrapper[28758]: I0223 14:43:11.235101 28758 generic.go:334] "Generic (PLEG): container finished" podID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerID="83194a59781a5972b3cb7f94cfa5eb6c29966a6dee9ed29e4577d2f481113f94" exitCode=0
Feb 23 14:43:11.235244 master-0 kubenswrapper[28758]: I0223 14:43:11.235153 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm" event={"ID":"c4e41f54-d479-4cbc-8ed9-984aae83ad0d","Type":"ContainerDied","Data":"83194a59781a5972b3cb7f94cfa5eb6c29966a6dee9ed29e4577d2f481113f94"}
Feb 23 14:43:12.772218 master-0 kubenswrapper[28758]: I0223 14:43:12.772076 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz"
Feb 23 14:43:12.775057 master-0 kubenswrapper[28758]: I0223 14:43:12.775014 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm"
Feb 23 14:43:12.777250 master-0 kubenswrapper[28758]: I0223 14:43:12.777198 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5"
Feb 23 14:43:12.805528 master-0 kubenswrapper[28758]: I0223 14:43:12.805467 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-bundle\") pod \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") "
Feb 23 14:43:12.805965 master-0 kubenswrapper[28758]: I0223 14:43:12.805676 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-util\") pod \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") "
Feb 23 14:43:12.805965 master-0 kubenswrapper[28758]: I0223 14:43:12.805711 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4vf\" (UniqueName: \"kubernetes.io/projected/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-kube-api-access-qb4vf\") pod \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\" (UID: \"c4e41f54-d479-4cbc-8ed9-984aae83ad0d\") "
Feb 23 14:43:12.805965 master-0 kubenswrapper[28758]: I0223 14:43:12.805730 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-756hx\" (UniqueName: \"kubernetes.io/projected/3fcade64-f544-4158-9a39-31cace8542dc-kube-api-access-756hx\") pod \"3fcade64-f544-4158-9a39-31cace8542dc\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") "
Feb 23 14:43:12.805965 master-0 kubenswrapper[28758]: I0223 14:43:12.805768 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-bundle\") pod \"66d96f69-3a24-4996-839e-05fc58aa8fb5\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") "
Feb 23 14:43:12.806121 master-0 kubenswrapper[28758]: I0223 14:43:12.805983 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-bundle\") pod \"3fcade64-f544-4158-9a39-31cace8542dc\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") "
Feb 23 14:43:12.806121 master-0 kubenswrapper[28758]: I0223 14:43:12.806005 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh7pp\" (UniqueName: \"kubernetes.io/projected/66d96f69-3a24-4996-839e-05fc58aa8fb5-kube-api-access-jh7pp\") pod \"66d96f69-3a24-4996-839e-05fc58aa8fb5\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") "
Feb 23 14:43:12.806121 master-0 kubenswrapper[28758]: I0223 14:43:12.806040 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-util\") pod \"3fcade64-f544-4158-9a39-31cace8542dc\" (UID: \"3fcade64-f544-4158-9a39-31cace8542dc\") "
Feb 23 14:43:12.806121 master-0 kubenswrapper[28758]: I0223 14:43:12.806058 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-util\") pod \"66d96f69-3a24-4996-839e-05fc58aa8fb5\" (UID: \"66d96f69-3a24-4996-839e-05fc58aa8fb5\") "
Feb 23 14:43:12.812131 master-0 kubenswrapper[28758]: I0223 14:43:12.807161 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-bundle" (OuterVolumeSpecName: "bundle") pod "c4e41f54-d479-4cbc-8ed9-984aae83ad0d" (UID: "c4e41f54-d479-4cbc-8ed9-984aae83ad0d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:43:12.812131 master-0 kubenswrapper[28758]: I0223 14:43:12.810811 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d96f69-3a24-4996-839e-05fc58aa8fb5-kube-api-access-jh7pp" (OuterVolumeSpecName: "kube-api-access-jh7pp") pod "66d96f69-3a24-4996-839e-05fc58aa8fb5" (UID: "66d96f69-3a24-4996-839e-05fc58aa8fb5"). InnerVolumeSpecName "kube-api-access-jh7pp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:43:12.812131 master-0 kubenswrapper[28758]: I0223 14:43:12.811273 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-kube-api-access-qb4vf" (OuterVolumeSpecName: "kube-api-access-qb4vf") pod "c4e41f54-d479-4cbc-8ed9-984aae83ad0d" (UID: "c4e41f54-d479-4cbc-8ed9-984aae83ad0d"). InnerVolumeSpecName "kube-api-access-qb4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:43:12.816683 master-0 kubenswrapper[28758]: I0223 14:43:12.816609 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-bundle" (OuterVolumeSpecName: "bundle") pod "3fcade64-f544-4158-9a39-31cace8542dc" (UID: "3fcade64-f544-4158-9a39-31cace8542dc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:43:12.817367 master-0 kubenswrapper[28758]: I0223 14:43:12.817228 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-bundle" (OuterVolumeSpecName: "bundle") pod "66d96f69-3a24-4996-839e-05fc58aa8fb5" (UID: "66d96f69-3a24-4996-839e-05fc58aa8fb5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:43:12.819331 master-0 kubenswrapper[28758]: I0223 14:43:12.819259 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fcade64-f544-4158-9a39-31cace8542dc-kube-api-access-756hx" (OuterVolumeSpecName: "kube-api-access-756hx") pod "3fcade64-f544-4158-9a39-31cace8542dc" (UID: "3fcade64-f544-4158-9a39-31cace8542dc"). InnerVolumeSpecName "kube-api-access-756hx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:43:12.824882 master-0 kubenswrapper[28758]: I0223 14:43:12.824826 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-util" (OuterVolumeSpecName: "util") pod "c4e41f54-d479-4cbc-8ed9-984aae83ad0d" (UID: "c4e41f54-d479-4cbc-8ed9-984aae83ad0d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:43:12.827820 master-0 kubenswrapper[28758]: I0223 14:43:12.827781 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-util" (OuterVolumeSpecName: "util") pod "66d96f69-3a24-4996-839e-05fc58aa8fb5" (UID: "66d96f69-3a24-4996-839e-05fc58aa8fb5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:43:12.834463 master-0 kubenswrapper[28758]: I0223 14:43:12.834371 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-util" (OuterVolumeSpecName: "util") pod "3fcade64-f544-4158-9a39-31cace8542dc" (UID: "3fcade64-f544-4158-9a39-31cace8542dc"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:43:12.907592 master-0 kubenswrapper[28758]: I0223 14:43:12.907544 28758 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-util\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.907823 master-0 kubenswrapper[28758]: I0223 14:43:12.907808 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4vf\" (UniqueName: \"kubernetes.io/projected/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-kube-api-access-qb4vf\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.907888 master-0 kubenswrapper[28758]: I0223 14:43:12.907878 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-756hx\" (UniqueName: \"kubernetes.io/projected/3fcade64-f544-4158-9a39-31cace8542dc-kube-api-access-756hx\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.907983 master-0 kubenswrapper[28758]: I0223 14:43:12.907971 28758 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.908050 master-0 kubenswrapper[28758]: I0223 14:43:12.908041 28758 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.908110 master-0 kubenswrapper[28758]: I0223 14:43:12.908100 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh7pp\" (UniqueName: \"kubernetes.io/projected/66d96f69-3a24-4996-839e-05fc58aa8fb5-kube-api-access-jh7pp\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.908169 master-0 kubenswrapper[28758]: I0223 14:43:12.908159 28758 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3fcade64-f544-4158-9a39-31cace8542dc-util\") on 
node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.908229 master-0 kubenswrapper[28758]: I0223 14:43:12.908220 28758 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/66d96f69-3a24-4996-839e-05fc58aa8fb5-util\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:12.908287 master-0 kubenswrapper[28758]: I0223 14:43:12.908277 28758 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c4e41f54-d479-4cbc-8ed9-984aae83ad0d-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:43:13.254669 master-0 kubenswrapper[28758]: I0223 14:43:13.254436 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm" event={"ID":"c4e41f54-d479-4cbc-8ed9-984aae83ad0d","Type":"ContainerDied","Data":"3b32c4e5c105ce63995977ccd27c168f158710fe49d26e8a795de19b0ecf8182"} Feb 23 14:43:13.254669 master-0 kubenswrapper[28758]: I0223 14:43:13.254657 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b32c4e5c105ce63995977ccd27c168f158710fe49d26e8a795de19b0ecf8182" Feb 23 14:43:13.254669 master-0 kubenswrapper[28758]: I0223 14:43:13.254521 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213p2rrm" Feb 23 14:43:13.257808 master-0 kubenswrapper[28758]: I0223 14:43:13.257744 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz" event={"ID":"66d96f69-3a24-4996-839e-05fc58aa8fb5","Type":"ContainerDied","Data":"8011cd6ff20f578826a6de51ba0df19bde87bd9b6392edee8df7da72167352d6"} Feb 23 14:43:13.257808 master-0 kubenswrapper[28758]: I0223 14:43:13.257803 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8011cd6ff20f578826a6de51ba0df19bde87bd9b6392edee8df7da72167352d6" Feb 23 14:43:13.257952 master-0 kubenswrapper[28758]: I0223 14:43:13.257888 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca5pswz" Feb 23 14:43:13.264571 master-0 kubenswrapper[28758]: I0223 14:43:13.264443 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5" event={"ID":"3fcade64-f544-4158-9a39-31cace8542dc","Type":"ContainerDied","Data":"bce3a0437b8e8e3f9c2f432da9c1e325f7d1e21c2a533df16e7f32bbeb45a766"} Feb 23 14:43:13.264730 master-0 kubenswrapper[28758]: I0223 14:43:13.264576 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce3a0437b8e8e3f9c2f432da9c1e325f7d1e21c2a533df16e7f32bbeb45a766" Feb 23 14:43:13.264730 master-0 kubenswrapper[28758]: I0223 14:43:13.264716 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hjjc5" Feb 23 14:43:16.647793 master-0 kubenswrapper[28758]: I0223 14:43:16.647704 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7"] Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648151 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerName="util" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648190 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerName="util" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648219 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerName="extract" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648228 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerName="extract" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648269 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcade64-f544-4158-9a39-31cace8542dc" containerName="extract" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648281 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcade64-f544-4158-9a39-31cace8542dc" containerName="extract" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648297 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcade64-f544-4158-9a39-31cace8542dc" containerName="pull" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648306 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcade64-f544-4158-9a39-31cace8542dc" containerName="pull" Feb 23 14:43:16.649627 master-0 
kubenswrapper[28758]: E0223 14:43:16.648341 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerName="extract" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648350 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerName="extract" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648369 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcade64-f544-4158-9a39-31cace8542dc" containerName="util" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648377 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcade64-f544-4158-9a39-31cace8542dc" containerName="util" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648394 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerName="util" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648421 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerName="util" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648438 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerName="pull" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648446 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerName="pull" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: E0223 14:43:16.648495 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerName="pull" Feb 23 14:43:16.649627 master-0 kubenswrapper[28758]: I0223 14:43:16.648504 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerName="pull" Feb 23 
14:43:16.658812 master-0 kubenswrapper[28758]: I0223 14:43:16.652686 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d96f69-3a24-4996-839e-05fc58aa8fb5" containerName="extract" Feb 23 14:43:16.658812 master-0 kubenswrapper[28758]: I0223 14:43:16.652783 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e41f54-d479-4cbc-8ed9-984aae83ad0d" containerName="extract" Feb 23 14:43:16.658812 master-0 kubenswrapper[28758]: I0223 14:43:16.652838 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcade64-f544-4158-9a39-31cace8542dc" containerName="extract" Feb 23 14:43:16.659542 master-0 kubenswrapper[28758]: I0223 14:43:16.659383 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.665495 master-0 kubenswrapper[28758]: I0223 14:43:16.661987 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-49vsk" Feb 23 14:43:16.667463 master-0 kubenswrapper[28758]: I0223 14:43:16.667379 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.667463 master-0 kubenswrapper[28758]: I0223 14:43:16.667438 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.667650 master-0 kubenswrapper[28758]: I0223 14:43:16.667560 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8f6v\" (UniqueName: \"kubernetes.io/projected/07e4ac1b-3e7f-471c-8a03-a320624aa15f-kube-api-access-p8f6v\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.709509 master-0 kubenswrapper[28758]: I0223 14:43:16.695427 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7"] Feb 23 14:43:16.769676 master-0 kubenswrapper[28758]: I0223 14:43:16.769576 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8f6v\" (UniqueName: \"kubernetes.io/projected/07e4ac1b-3e7f-471c-8a03-a320624aa15f-kube-api-access-p8f6v\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.769937 master-0 kubenswrapper[28758]: I0223 14:43:16.769894 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.769981 master-0 kubenswrapper[28758]: I0223 14:43:16.769959 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.770618 master-0 kubenswrapper[28758]: I0223 14:43:16.770588 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.770695 master-0 kubenswrapper[28758]: I0223 14:43:16.770580 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.790814 master-0 kubenswrapper[28758]: I0223 14:43:16.786104 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8f6v\" (UniqueName: \"kubernetes.io/projected/07e4ac1b-3e7f-471c-8a03-a320624aa15f-kube-api-access-p8f6v\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:16.994324 master-0 kubenswrapper[28758]: I0223 14:43:16.994213 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:17.414170 master-0 kubenswrapper[28758]: I0223 14:43:17.414037 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7"] Feb 23 14:43:17.422824 master-0 kubenswrapper[28758]: W0223 14:43:17.422786 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e4ac1b_3e7f_471c_8a03_a320624aa15f.slice/crio-c34675e7d32b5433c5ec1f541dee133384b10004feaa646ec523488cef978921 WatchSource:0}: Error finding container c34675e7d32b5433c5ec1f541dee133384b10004feaa646ec523488cef978921: Status 404 returned error can't find the container with id c34675e7d32b5433c5ec1f541dee133384b10004feaa646ec523488cef978921 Feb 23 14:43:17.716532 master-0 kubenswrapper[28758]: I0223 14:43:17.716470 28758 generic.go:334] "Generic (PLEG): container finished" podID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerID="ff821c21575b6ea65e6d7d0cac76092761ad1aebeea4b9e8e0e5a93e50a3c5f1" exitCode=0 Feb 23 14:43:17.717059 master-0 kubenswrapper[28758]: I0223 14:43:17.716682 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" event={"ID":"07e4ac1b-3e7f-471c-8a03-a320624aa15f","Type":"ContainerDied","Data":"ff821c21575b6ea65e6d7d0cac76092761ad1aebeea4b9e8e0e5a93e50a3c5f1"} Feb 23 14:43:17.717160 master-0 kubenswrapper[28758]: I0223 14:43:17.717139 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" event={"ID":"07e4ac1b-3e7f-471c-8a03-a320624aa15f","Type":"ContainerStarted","Data":"c34675e7d32b5433c5ec1f541dee133384b10004feaa646ec523488cef978921"} Feb 23 14:43:19.733683 master-0 kubenswrapper[28758]: I0223 14:43:19.733570 28758 
generic.go:334] "Generic (PLEG): container finished" podID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerID="23729810ee9be9b1f1ea3f237a4a5a3929c02ae05111a63e18f1e6fedc21b92b" exitCode=0 Feb 23 14:43:19.733683 master-0 kubenswrapper[28758]: I0223 14:43:19.733645 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" event={"ID":"07e4ac1b-3e7f-471c-8a03-a320624aa15f","Type":"ContainerDied","Data":"23729810ee9be9b1f1ea3f237a4a5a3929c02ae05111a63e18f1e6fedc21b92b"} Feb 23 14:43:20.272583 master-0 kubenswrapper[28758]: I0223 14:43:20.272468 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl"] Feb 23 14:43:20.273843 master-0 kubenswrapper[28758]: I0223 14:43:20.273806 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 14:43:20.275996 master-0 kubenswrapper[28758]: I0223 14:43:20.275963 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 23 14:43:20.276097 master-0 kubenswrapper[28758]: I0223 14:43:20.275996 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 23 14:43:20.305844 master-0 kubenswrapper[28758]: I0223 14:43:20.305772 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl"] Feb 23 14:43:20.425617 master-0 kubenswrapper[28758]: I0223 14:43:20.424428 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8mt2\" (UniqueName: \"kubernetes.io/projected/faa347c5-7062-48b8-9b71-cacb92a37d53-kube-api-access-t8mt2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-pvjnl\" (UID: 
\"faa347c5-7062-48b8-9b71-cacb92a37d53\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 14:43:20.425617 master-0 kubenswrapper[28758]: I0223 14:43:20.424581 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/faa347c5-7062-48b8-9b71-cacb92a37d53-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-pvjnl\" (UID: \"faa347c5-7062-48b8-9b71-cacb92a37d53\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 14:43:20.525875 master-0 kubenswrapper[28758]: I0223 14:43:20.525796 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/faa347c5-7062-48b8-9b71-cacb92a37d53-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-pvjnl\" (UID: \"faa347c5-7062-48b8-9b71-cacb92a37d53\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 14:43:20.526083 master-0 kubenswrapper[28758]: I0223 14:43:20.525923 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8mt2\" (UniqueName: \"kubernetes.io/projected/faa347c5-7062-48b8-9b71-cacb92a37d53-kube-api-access-t8mt2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-pvjnl\" (UID: \"faa347c5-7062-48b8-9b71-cacb92a37d53\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 14:43:20.526616 master-0 kubenswrapper[28758]: I0223 14:43:20.526578 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/faa347c5-7062-48b8-9b71-cacb92a37d53-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-pvjnl\" (UID: \"faa347c5-7062-48b8-9b71-cacb92a37d53\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 
14:43:20.545714 master-0 kubenswrapper[28758]: I0223 14:43:20.545663 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8mt2\" (UniqueName: \"kubernetes.io/projected/faa347c5-7062-48b8-9b71-cacb92a37d53-kube-api-access-t8mt2\") pod \"cert-manager-operator-controller-manager-66c8bdd694-pvjnl\" (UID: \"faa347c5-7062-48b8-9b71-cacb92a37d53\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 14:43:20.588190 master-0 kubenswrapper[28758]: I0223 14:43:20.588119 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" Feb 23 14:43:20.751083 master-0 kubenswrapper[28758]: I0223 14:43:20.750998 28758 generic.go:334] "Generic (PLEG): container finished" podID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerID="ad373440745e14ab8a1af7c847dbc192187ac7732eb030ab5a425b8e7b72b89c" exitCode=0 Feb 23 14:43:20.751083 master-0 kubenswrapper[28758]: I0223 14:43:20.751065 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" event={"ID":"07e4ac1b-3e7f-471c-8a03-a320624aa15f","Type":"ContainerDied","Data":"ad373440745e14ab8a1af7c847dbc192187ac7732eb030ab5a425b8e7b72b89c"} Feb 23 14:43:21.261407 master-0 kubenswrapper[28758]: I0223 14:43:21.261262 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl"] Feb 23 14:43:21.265773 master-0 kubenswrapper[28758]: W0223 14:43:21.265649 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaa347c5_7062_48b8_9b71_cacb92a37d53.slice/crio-0aa658aa574012775ae81c119a4a3aa152fb97be3425e23acd8f80686b7d6602 WatchSource:0}: Error finding container 
0aa658aa574012775ae81c119a4a3aa152fb97be3425e23acd8f80686b7d6602: Status 404 returned error can't find the container with id 0aa658aa574012775ae81c119a4a3aa152fb97be3425e23acd8f80686b7d6602 Feb 23 14:43:21.759686 master-0 kubenswrapper[28758]: I0223 14:43:21.759623 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" event={"ID":"faa347c5-7062-48b8-9b71-cacb92a37d53","Type":"ContainerStarted","Data":"0aa658aa574012775ae81c119a4a3aa152fb97be3425e23acd8f80686b7d6602"} Feb 23 14:43:22.096849 master-0 kubenswrapper[28758]: I0223 14:43:22.096805 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" Feb 23 14:43:22.167716 master-0 kubenswrapper[28758]: I0223 14:43:22.165617 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8f6v\" (UniqueName: \"kubernetes.io/projected/07e4ac1b-3e7f-471c-8a03-a320624aa15f-kube-api-access-p8f6v\") pod \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " Feb 23 14:43:22.167716 master-0 kubenswrapper[28758]: I0223 14:43:22.165793 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-bundle\") pod \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " Feb 23 14:43:22.167716 master-0 kubenswrapper[28758]: I0223 14:43:22.165849 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-util\") pod \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\" (UID: \"07e4ac1b-3e7f-471c-8a03-a320624aa15f\") " Feb 23 14:43:22.171493 master-0 kubenswrapper[28758]: I0223 14:43:22.170747 28758 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-bundle" (OuterVolumeSpecName: "bundle") pod "07e4ac1b-3e7f-471c-8a03-a320624aa15f" (UID: "07e4ac1b-3e7f-471c-8a03-a320624aa15f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:43:22.175534 master-0 kubenswrapper[28758]: I0223 14:43:22.174795 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e4ac1b-3e7f-471c-8a03-a320624aa15f-kube-api-access-p8f6v" (OuterVolumeSpecName: "kube-api-access-p8f6v") pod "07e4ac1b-3e7f-471c-8a03-a320624aa15f" (UID: "07e4ac1b-3e7f-471c-8a03-a320624aa15f"). InnerVolumeSpecName "kube-api-access-p8f6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:43:22.194502 master-0 kubenswrapper[28758]: I0223 14:43:22.192808 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-util" (OuterVolumeSpecName: "util") pod "07e4ac1b-3e7f-471c-8a03-a320624aa15f" (UID: "07e4ac1b-3e7f-471c-8a03-a320624aa15f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:43:22.268345 master-0 kubenswrapper[28758]: I0223 14:43:22.268286 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8f6v\" (UniqueName: \"kubernetes.io/projected/07e4ac1b-3e7f-471c-8a03-a320624aa15f-kube-api-access-p8f6v\") on node \"master-0\" DevicePath \"\""
Feb 23 14:43:22.268345 master-0 kubenswrapper[28758]: I0223 14:43:22.268337 28758 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-bundle\") on node \"master-0\" DevicePath \"\""
Feb 23 14:43:22.268345 master-0 kubenswrapper[28758]: I0223 14:43:22.268352 28758 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/07e4ac1b-3e7f-471c-8a03-a320624aa15f-util\") on node \"master-0\" DevicePath \"\""
Feb 23 14:43:22.775644 master-0 kubenswrapper[28758]: I0223 14:43:22.774791 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7" event={"ID":"07e4ac1b-3e7f-471c-8a03-a320624aa15f","Type":"ContainerDied","Data":"c34675e7d32b5433c5ec1f541dee133384b10004feaa646ec523488cef978921"}
Feb 23 14:43:22.775644 master-0 kubenswrapper[28758]: I0223 14:43:22.774851 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c34675e7d32b5433c5ec1f541dee133384b10004feaa646ec523488cef978921"
Feb 23 14:43:22.775644 master-0 kubenswrapper[28758]: I0223 14:43:22.774921 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0887kg7"
Feb 23 14:43:25.797803 master-0 kubenswrapper[28758]: I0223 14:43:25.797743 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" event={"ID":"faa347c5-7062-48b8-9b71-cacb92a37d53","Type":"ContainerStarted","Data":"7dfcfaaf14a0c406f8da547c4f41cad96584472631e204cb025d9bcd87fcf5ff"}
Feb 23 14:43:25.829574 master-0 kubenswrapper[28758]: I0223 14:43:25.829461 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-pvjnl" podStartSLOduration=1.993432682 podStartE2EDuration="5.829431861s" podCreationTimestamp="2026-02-23 14:43:20 +0000 UTC" firstStartedPulling="2026-02-23 14:43:21.270699158 +0000 UTC m=+533.397015080" lastFinishedPulling="2026-02-23 14:43:25.106698327 +0000 UTC m=+537.233014259" observedRunningTime="2026-02-23 14:43:25.827868689 +0000 UTC m=+537.954184621" watchObservedRunningTime="2026-02-23 14:43:25.829431861 +0000 UTC m=+537.955747793"
Feb 23 14:43:28.507670 master-0 kubenswrapper[28758]: I0223 14:43:28.507594 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qzgk8"]
Feb 23 14:43:28.508335 master-0 kubenswrapper[28758]: E0223 14:43:28.507976 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerName="pull"
Feb 23 14:43:28.508335 master-0 kubenswrapper[28758]: I0223 14:43:28.507994 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerName="pull"
Feb 23 14:43:28.508335 master-0 kubenswrapper[28758]: E0223 14:43:28.508008 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerName="util"
Feb 23 14:43:28.508335 master-0 kubenswrapper[28758]: I0223 14:43:28.508014 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerName="util"
Feb 23 14:43:28.508335 master-0 kubenswrapper[28758]: E0223 14:43:28.508026 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerName="extract"
Feb 23 14:43:28.508335 master-0 kubenswrapper[28758]: I0223 14:43:28.508032 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerName="extract"
Feb 23 14:43:28.508335 master-0 kubenswrapper[28758]: I0223 14:43:28.508196 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e4ac1b-3e7f-471c-8a03-a320624aa15f" containerName="extract"
Feb 23 14:43:28.508764 master-0 kubenswrapper[28758]: I0223 14:43:28.508736 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:28.514268 master-0 kubenswrapper[28758]: I0223 14:43:28.514217 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 23 14:43:28.514547 master-0 kubenswrapper[28758]: I0223 14:43:28.514518 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 23 14:43:28.520422 master-0 kubenswrapper[28758]: I0223 14:43:28.520382 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qzgk8"]
Feb 23 14:43:28.685989 master-0 kubenswrapper[28758]: I0223 14:43:28.685903 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97c30264-9449-47c1-8777-2af752e19ffc-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qzgk8\" (UID: \"97c30264-9449-47c1-8777-2af752e19ffc\") " pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:28.686235 master-0 kubenswrapper[28758]: I0223 14:43:28.686011 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txg4j\" (UniqueName: \"kubernetes.io/projected/97c30264-9449-47c1-8777-2af752e19ffc-kube-api-access-txg4j\") pod \"cert-manager-webhook-6888856db4-qzgk8\" (UID: \"97c30264-9449-47c1-8777-2af752e19ffc\") " pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:28.787566 master-0 kubenswrapper[28758]: I0223 14:43:28.787419 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txg4j\" (UniqueName: \"kubernetes.io/projected/97c30264-9449-47c1-8777-2af752e19ffc-kube-api-access-txg4j\") pod \"cert-manager-webhook-6888856db4-qzgk8\" (UID: \"97c30264-9449-47c1-8777-2af752e19ffc\") " pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:28.787750 master-0 kubenswrapper[28758]: I0223 14:43:28.787587 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97c30264-9449-47c1-8777-2af752e19ffc-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qzgk8\" (UID: \"97c30264-9449-47c1-8777-2af752e19ffc\") " pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:28.806162 master-0 kubenswrapper[28758]: I0223 14:43:28.806127 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97c30264-9449-47c1-8777-2af752e19ffc-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-qzgk8\" (UID: \"97c30264-9449-47c1-8777-2af752e19ffc\") " pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:28.807222 master-0 kubenswrapper[28758]: I0223 14:43:28.807181 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txg4j\" (UniqueName: \"kubernetes.io/projected/97c30264-9449-47c1-8777-2af752e19ffc-kube-api-access-txg4j\") pod \"cert-manager-webhook-6888856db4-qzgk8\" (UID: \"97c30264-9449-47c1-8777-2af752e19ffc\") " pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:28.828510 master-0 kubenswrapper[28758]: I0223 14:43:28.825645 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:29.267674 master-0 kubenswrapper[28758]: I0223 14:43:29.266258 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-qzgk8"]
Feb 23 14:43:29.825307 master-0 kubenswrapper[28758]: I0223 14:43:29.825231 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8" event={"ID":"97c30264-9449-47c1-8777-2af752e19ffc","Type":"ContainerStarted","Data":"5f9ac44568c0a70a6a454a224d634fe619fd96502f611c6fddaaff56b77cddb0"}
Feb 23 14:43:31.197393 master-0 kubenswrapper[28758]: I0223 14:43:31.197333 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rfx4r"]
Feb 23 14:43:31.198323 master-0 kubenswrapper[28758]: I0223 14:43:31.198296 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:31.212817 master-0 kubenswrapper[28758]: I0223 14:43:31.212615 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rfx4r"]
Feb 23 14:43:31.327884 master-0 kubenswrapper[28758]: I0223 14:43:31.327794 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0421842-6f5a-457c-931b-5e9743df11e2-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rfx4r\" (UID: \"f0421842-6f5a-457c-931b-5e9743df11e2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:31.328126 master-0 kubenswrapper[28758]: I0223 14:43:31.327969 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xpkq\" (UniqueName: \"kubernetes.io/projected/f0421842-6f5a-457c-931b-5e9743df11e2-kube-api-access-4xpkq\") pod \"cert-manager-cainjector-5545bd876-rfx4r\" (UID: \"f0421842-6f5a-457c-931b-5e9743df11e2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:31.430463 master-0 kubenswrapper[28758]: I0223 14:43:31.430405 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xpkq\" (UniqueName: \"kubernetes.io/projected/f0421842-6f5a-457c-931b-5e9743df11e2-kube-api-access-4xpkq\") pod \"cert-manager-cainjector-5545bd876-rfx4r\" (UID: \"f0421842-6f5a-457c-931b-5e9743df11e2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:31.430763 master-0 kubenswrapper[28758]: I0223 14:43:31.430571 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0421842-6f5a-457c-931b-5e9743df11e2-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rfx4r\" (UID: \"f0421842-6f5a-457c-931b-5e9743df11e2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:31.447578 master-0 kubenswrapper[28758]: I0223 14:43:31.447448 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f0421842-6f5a-457c-931b-5e9743df11e2-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-rfx4r\" (UID: \"f0421842-6f5a-457c-931b-5e9743df11e2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:31.453363 master-0 kubenswrapper[28758]: I0223 14:43:31.453285 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xpkq\" (UniqueName: \"kubernetes.io/projected/f0421842-6f5a-457c-931b-5e9743df11e2-kube-api-access-4xpkq\") pod \"cert-manager-cainjector-5545bd876-rfx4r\" (UID: \"f0421842-6f5a-457c-931b-5e9743df11e2\") " pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:31.533582 master-0 kubenswrapper[28758]: I0223 14:43:31.533503 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r"
Feb 23 14:43:32.115603 master-0 kubenswrapper[28758]: I0223 14:43:32.114007 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-rfx4r"]
Feb 23 14:43:32.118527 master-0 kubenswrapper[28758]: W0223 14:43:32.118460 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0421842_6f5a_457c_931b_5e9743df11e2.slice/crio-058c8e3d113a25022421796206f8c4db746e3a90903ef585b88c348928f3669a WatchSource:0}: Error finding container 058c8e3d113a25022421796206f8c4db746e3a90903ef585b88c348928f3669a: Status 404 returned error can't find the container with id 058c8e3d113a25022421796206f8c4db746e3a90903ef585b88c348928f3669a
Feb 23 14:43:32.218506 master-0 kubenswrapper[28758]: I0223 14:43:32.208289 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"]
Feb 23 14:43:32.218506 master-0 kubenswrapper[28758]: I0223 14:43:32.209632 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"
Feb 23 14:43:32.218506 master-0 kubenswrapper[28758]: I0223 14:43:32.211539 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 23 14:43:32.218506 master-0 kubenswrapper[28758]: I0223 14:43:32.211963 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 23 14:43:32.235596 master-0 kubenswrapper[28758]: I0223 14:43:32.235532 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"]
Feb 23 14:43:32.350436 master-0 kubenswrapper[28758]: I0223 14:43:32.350377 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7c9\" (UniqueName: \"kubernetes.io/projected/c12ddca1-f73b-49d7-8906-5c8a32346b00-kube-api-access-7j7c9\") pod \"nmstate-operator-694c9596b7-2w9dm\" (UID: \"c12ddca1-f73b-49d7-8906-5c8a32346b00\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"
Feb 23 14:43:32.452248 master-0 kubenswrapper[28758]: I0223 14:43:32.452166 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7c9\" (UniqueName: \"kubernetes.io/projected/c12ddca1-f73b-49d7-8906-5c8a32346b00-kube-api-access-7j7c9\") pod \"nmstate-operator-694c9596b7-2w9dm\" (UID: \"c12ddca1-f73b-49d7-8906-5c8a32346b00\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"
Feb 23 14:43:32.469005 master-0 kubenswrapper[28758]: I0223 14:43:32.468916 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7c9\" (UniqueName: \"kubernetes.io/projected/c12ddca1-f73b-49d7-8906-5c8a32346b00-kube-api-access-7j7c9\") pod \"nmstate-operator-694c9596b7-2w9dm\" (UID: \"c12ddca1-f73b-49d7-8906-5c8a32346b00\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"
Feb 23 14:43:32.540187 master-0 kubenswrapper[28758]: I0223 14:43:32.540116 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"
Feb 23 14:43:32.850850 master-0 kubenswrapper[28758]: I0223 14:43:32.850704 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r" event={"ID":"f0421842-6f5a-457c-931b-5e9743df11e2","Type":"ContainerStarted","Data":"058c8e3d113a25022421796206f8c4db746e3a90903ef585b88c348928f3669a"}
Feb 23 14:43:32.996826 master-0 kubenswrapper[28758]: I0223 14:43:32.996744 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-2w9dm"]
Feb 23 14:43:34.873571 master-0 kubenswrapper[28758]: I0223 14:43:34.873518 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm" event={"ID":"c12ddca1-f73b-49d7-8906-5c8a32346b00","Type":"ContainerStarted","Data":"5e0a7f266544cf56614e122bcadd021c5418e1fd7997e2c38c0a9381288ded18"}
Feb 23 14:43:35.883673 master-0 kubenswrapper[28758]: I0223 14:43:35.883586 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8" event={"ID":"97c30264-9449-47c1-8777-2af752e19ffc","Type":"ContainerStarted","Data":"0b4c6d3f62a15ac5d4d6c2573d19943c96983dc88a57c9ca1070342dd897028f"}
Feb 23 14:43:35.884200 master-0 kubenswrapper[28758]: I0223 14:43:35.883705 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:35.885460 master-0 kubenswrapper[28758]: I0223 14:43:35.885387 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r" event={"ID":"f0421842-6f5a-457c-931b-5e9743df11e2","Type":"ContainerStarted","Data":"7b7e3e53fea7555de29938e136c6ba23fa320533f9334fb450a959bb7d19224b"}
Feb 23 14:43:35.911871 master-0 kubenswrapper[28758]: I0223 14:43:35.911783 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8" podStartSLOduration=2.424392205 podStartE2EDuration="7.911755957s" podCreationTimestamp="2026-02-23 14:43:28 +0000 UTC" firstStartedPulling="2026-02-23 14:43:29.271506204 +0000 UTC m=+541.397822136" lastFinishedPulling="2026-02-23 14:43:34.758869946 +0000 UTC m=+546.885185888" observedRunningTime="2026-02-23 14:43:35.904177135 +0000 UTC m=+548.030493067" watchObservedRunningTime="2026-02-23 14:43:35.911755957 +0000 UTC m=+548.038071899"
Feb 23 14:43:35.933175 master-0 kubenswrapper[28758]: I0223 14:43:35.933082 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-rfx4r" podStartSLOduration=2.294859616 podStartE2EDuration="4.933053782s" podCreationTimestamp="2026-02-23 14:43:31 +0000 UTC" firstStartedPulling="2026-02-23 14:43:32.120797693 +0000 UTC m=+544.247113625" lastFinishedPulling="2026-02-23 14:43:34.758991859 +0000 UTC m=+546.885307791" observedRunningTime="2026-02-23 14:43:35.920574181 +0000 UTC m=+548.046890123" watchObservedRunningTime="2026-02-23 14:43:35.933053782 +0000 UTC m=+548.059369714"
Feb 23 14:43:37.900159 master-0 kubenswrapper[28758]: I0223 14:43:37.900045 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm" event={"ID":"c12ddca1-f73b-49d7-8906-5c8a32346b00","Type":"ContainerStarted","Data":"467880ff5a38c1eddf35347c56a3a01a20276f56dbf2f3d22a2b7ae08889117b"}
Feb 23 14:43:37.934021 master-0 kubenswrapper[28758]: I0223 14:43:37.933925 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-2w9dm" podStartSLOduration=2.982019399 podStartE2EDuration="5.93390085s" podCreationTimestamp="2026-02-23 14:43:32 +0000 UTC" firstStartedPulling="2026-02-23 14:43:34.655587615 +0000 UTC m=+546.781903547" lastFinishedPulling="2026-02-23 14:43:37.607469066 +0000 UTC m=+549.733784998" observedRunningTime="2026-02-23 14:43:37.929051661 +0000 UTC m=+550.055367593" watchObservedRunningTime="2026-02-23 14:43:37.93390085 +0000 UTC m=+550.060216782"
Feb 23 14:43:38.508050 master-0 kubenswrapper[28758]: I0223 14:43:38.507999 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-v68pt"]
Feb 23 14:43:38.509301 master-0 kubenswrapper[28758]: I0223 14:43:38.509280 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:38.527157 master-0 kubenswrapper[28758]: I0223 14:43:38.526694 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-v68pt"]
Feb 23 14:43:38.659887 master-0 kubenswrapper[28758]: I0223 14:43:38.659823 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdgv\" (UniqueName: \"kubernetes.io/projected/ada99ef0-4c96-4b39-bfe0-7f05673805e1-kube-api-access-5rdgv\") pod \"cert-manager-545d4d4674-v68pt\" (UID: \"ada99ef0-4c96-4b39-bfe0-7f05673805e1\") " pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:38.660124 master-0 kubenswrapper[28758]: I0223 14:43:38.659919 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ada99ef0-4c96-4b39-bfe0-7f05673805e1-bound-sa-token\") pod \"cert-manager-545d4d4674-v68pt\" (UID: \"ada99ef0-4c96-4b39-bfe0-7f05673805e1\") " pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:38.762450 master-0 kubenswrapper[28758]: I0223 14:43:38.762281 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdgv\" (UniqueName: \"kubernetes.io/projected/ada99ef0-4c96-4b39-bfe0-7f05673805e1-kube-api-access-5rdgv\") pod \"cert-manager-545d4d4674-v68pt\" (UID: \"ada99ef0-4c96-4b39-bfe0-7f05673805e1\") " pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:38.762450 master-0 kubenswrapper[28758]: I0223 14:43:38.762408 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ada99ef0-4c96-4b39-bfe0-7f05673805e1-bound-sa-token\") pod \"cert-manager-545d4d4674-v68pt\" (UID: \"ada99ef0-4c96-4b39-bfe0-7f05673805e1\") " pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:38.778712 master-0 kubenswrapper[28758]: I0223 14:43:38.778643 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ada99ef0-4c96-4b39-bfe0-7f05673805e1-bound-sa-token\") pod \"cert-manager-545d4d4674-v68pt\" (UID: \"ada99ef0-4c96-4b39-bfe0-7f05673805e1\") " pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:38.779039 master-0 kubenswrapper[28758]: I0223 14:43:38.779000 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdgv\" (UniqueName: \"kubernetes.io/projected/ada99ef0-4c96-4b39-bfe0-7f05673805e1-kube-api-access-5rdgv\") pod \"cert-manager-545d4d4674-v68pt\" (UID: \"ada99ef0-4c96-4b39-bfe0-7f05673805e1\") " pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:38.824012 master-0 kubenswrapper[28758]: I0223 14:43:38.823914 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-v68pt"
Feb 23 14:43:39.338515 master-0 kubenswrapper[28758]: I0223 14:43:39.337507 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-v68pt"]
Feb 23 14:43:39.665096 master-0 kubenswrapper[28758]: I0223 14:43:39.665030 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"]
Feb 23 14:43:39.666015 master-0 kubenswrapper[28758]: I0223 14:43:39.665976 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.668095 master-0 kubenswrapper[28758]: I0223 14:43:39.668048 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Feb 23 14:43:39.668363 master-0 kubenswrapper[28758]: I0223 14:43:39.668341 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Feb 23 14:43:39.668678 master-0 kubenswrapper[28758]: I0223 14:43:39.668651 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Feb 23 14:43:39.668882 master-0 kubenswrapper[28758]: I0223 14:43:39.668860 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Feb 23 14:43:39.687919 master-0 kubenswrapper[28758]: I0223 14:43:39.687852 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"]
Feb 23 14:43:39.801348 master-0 kubenswrapper[28758]: I0223 14:43:39.801277 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2q9m\" (UniqueName: \"kubernetes.io/projected/0bb85bd1-300c-4786-96ff-56978d399495-kube-api-access-r2q9m\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.801631 master-0 kubenswrapper[28758]: I0223 14:43:39.801400 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bb85bd1-300c-4786-96ff-56978d399495-webhook-cert\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.801631 master-0 kubenswrapper[28758]: I0223 14:43:39.801425 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bb85bd1-300c-4786-96ff-56978d399495-apiservice-cert\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.904557 master-0 kubenswrapper[28758]: I0223 14:43:39.902454 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bb85bd1-300c-4786-96ff-56978d399495-webhook-cert\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.904557 master-0 kubenswrapper[28758]: I0223 14:43:39.902619 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bb85bd1-300c-4786-96ff-56978d399495-apiservice-cert\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.904557 master-0 kubenswrapper[28758]: I0223 14:43:39.902680 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2q9m\" (UniqueName: \"kubernetes.io/projected/0bb85bd1-300c-4786-96ff-56978d399495-kube-api-access-r2q9m\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.908713 master-0 kubenswrapper[28758]: I0223 14:43:39.908048 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0bb85bd1-300c-4786-96ff-56978d399495-apiservice-cert\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.914496 master-0 kubenswrapper[28758]: I0223 14:43:39.912026 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0bb85bd1-300c-4786-96ff-56978d399495-webhook-cert\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.929192 master-0 kubenswrapper[28758]: I0223 14:43:39.929144 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2q9m\" (UniqueName: \"kubernetes.io/projected/0bb85bd1-300c-4786-96ff-56978d399495-kube-api-access-r2q9m\") pod \"metallb-operator-controller-manager-7d66bdc8f4-chnqv\" (UID: \"0bb85bd1-300c-4786-96ff-56978d399495\") " pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:39.939204 master-0 kubenswrapper[28758]: I0223 14:43:39.939146 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-v68pt" event={"ID":"ada99ef0-4c96-4b39-bfe0-7f05673805e1","Type":"ContainerStarted","Data":"03fab8e6422c42c3e3d68eec3d085085c445c54f632fc2d8de66f065849a4adc"}
Feb 23 14:43:39.939204 master-0 kubenswrapper[28758]: I0223 14:43:39.939200 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-v68pt" event={"ID":"ada99ef0-4c96-4b39-bfe0-7f05673805e1","Type":"ContainerStarted","Data":"f094ef2cd8aaa37afd88be76de787ad354819b7bceed80d3c4f01963211dde48"}
Feb 23 14:43:39.986686 master-0 kubenswrapper[28758]: I0223 14:43:39.986637 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:43:40.021989 master-0 kubenswrapper[28758]: I0223 14:43:40.020603 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-v68pt" podStartSLOduration=2.020577836 podStartE2EDuration="2.020577836s" podCreationTimestamp="2026-02-23 14:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:43:40.011843645 +0000 UTC m=+552.138159587" watchObservedRunningTime="2026-02-23 14:43:40.020577836 +0000 UTC m=+552.146893768"
Feb 23 14:43:40.371223 master-0 kubenswrapper[28758]: I0223 14:43:40.371082 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"]
Feb 23 14:43:40.372444 master-0 kubenswrapper[28758]: I0223 14:43:40.372362 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.380754 master-0 kubenswrapper[28758]: I0223 14:43:40.374700 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 23 14:43:40.380754 master-0 kubenswrapper[28758]: I0223 14:43:40.374929 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Feb 23 14:43:40.424608 master-0 kubenswrapper[28758]: I0223 14:43:40.424542 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"]
Feb 23 14:43:40.517375 master-0 kubenswrapper[28758]: I0223 14:43:40.517317 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7ptn\" (UniqueName: \"kubernetes.io/projected/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-kube-api-access-v7ptn\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.517627 master-0 kubenswrapper[28758]: I0223 14:43:40.517467 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-apiservice-cert\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.517627 master-0 kubenswrapper[28758]: I0223 14:43:40.517517 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-webhook-cert\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.598500 master-0 kubenswrapper[28758]: I0223 14:43:40.594329 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"]
Feb 23 14:43:40.598500 master-0 kubenswrapper[28758]: W0223 14:43:40.594958 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bb85bd1_300c_4786_96ff_56978d399495.slice/crio-3c692c1f3f29b626ec5899b0505eb1e9fde671d309dcfbb1f27730de435b8f20 WatchSource:0}: Error finding container 3c692c1f3f29b626ec5899b0505eb1e9fde671d309dcfbb1f27730de435b8f20: Status 404 returned error can't find the container with id 3c692c1f3f29b626ec5899b0505eb1e9fde671d309dcfbb1f27730de435b8f20
Feb 23 14:43:40.621506 master-0 kubenswrapper[28758]: I0223 14:43:40.618831 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7ptn\" (UniqueName: \"kubernetes.io/projected/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-kube-api-access-v7ptn\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.621506 master-0 kubenswrapper[28758]: I0223 14:43:40.618898 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-apiservice-cert\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.621506 master-0 kubenswrapper[28758]: I0223 14:43:40.618916 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-webhook-cert\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.628496 master-0 kubenswrapper[28758]: I0223 14:43:40.625472 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-webhook-cert\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.628496 master-0 kubenswrapper[28758]: I0223 14:43:40.626623 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-apiservice-cert\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.660496 master-0 kubenswrapper[28758]: I0223 14:43:40.660325 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7ptn\" (UniqueName: \"kubernetes.io/projected/7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1-kube-api-access-v7ptn\") pod \"metallb-operator-webhook-server-57446bd8dd-6nwv7\" (UID: \"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1\") " pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.752086 master-0 kubenswrapper[28758]: I0223 14:43:40.751958 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"
Feb 23 14:43:40.961552 master-0 kubenswrapper[28758]: I0223 14:43:40.950792 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv" event={"ID":"0bb85bd1-300c-4786-96ff-56978d399495","Type":"ContainerStarted","Data":"3c692c1f3f29b626ec5899b0505eb1e9fde671d309dcfbb1f27730de435b8f20"}
Feb 23 14:43:41.247548 master-0 kubenswrapper[28758]: I0223 14:43:41.247493 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7"]
Feb 23 14:43:41.963563 master-0 kubenswrapper[28758]: I0223 14:43:41.963506 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7" event={"ID":"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1","Type":"ContainerStarted","Data":"36319f96b01af0656979229d22e9c7bbc69b461494a60b1d8df943b1c6d244d6"}
Feb 23 14:43:43.829508 master-0 kubenswrapper[28758]: I0223 14:43:43.828992 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8"
Feb 23 14:43:45.951679 master-0 kubenswrapper[28758]: I0223 14:43:45.951607 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf"]
Feb 23 14:43:45.952864 master-0 kubenswrapper[28758]: I0223 14:43:45.952841 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf"
Feb 23 14:43:45.954907 master-0 kubenswrapper[28758]: I0223 14:43:45.954862 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Feb 23 14:43:45.955458 master-0 kubenswrapper[28758]: I0223 14:43:45.955434 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Feb 23 14:43:46.064700 master-0 kubenswrapper[28758]: I0223 14:43:46.063538 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf"]
Feb 23 14:43:46.131588 master-0 kubenswrapper[28758]: I0223 14:43:46.130532 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqk9q\" (UniqueName: \"kubernetes.io/projected/654cb9c9-e5a2-4875-bbd7-7616df756c19-kube-api-access-gqk9q\") pod \"obo-prometheus-operator-68bc856cb9-7b5nf\" (UID: \"654cb9c9-e5a2-4875-bbd7-7616df756c19\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf"
Feb 23 14:43:46.152393 master-0 kubenswrapper[28758]: I0223 14:43:46.152293 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s"]
Feb 23 14:43:46.153408 master-0 kubenswrapper[28758]: I0223 14:43:46.153370 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.163510 master-0 kubenswrapper[28758]: I0223 14:43:46.162217 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 23 14:43:46.179979 master-0 kubenswrapper[28758]: I0223 14:43:46.179898 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v"] Feb 23 14:43:46.181052 master-0 kubenswrapper[28758]: I0223 14:43:46.181012 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.197968 master-0 kubenswrapper[28758]: I0223 14:43:46.196807 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s"] Feb 23 14:43:46.222399 master-0 kubenswrapper[28758]: I0223 14:43:46.222235 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v"] Feb 23 14:43:46.234240 master-0 kubenswrapper[28758]: I0223 14:43:46.232457 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqk9q\" (UniqueName: \"kubernetes.io/projected/654cb9c9-e5a2-4875-bbd7-7616df756c19-kube-api-access-gqk9q\") pod \"obo-prometheus-operator-68bc856cb9-7b5nf\" (UID: \"654cb9c9-e5a2-4875-bbd7-7616df756c19\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf" Feb 23 14:43:46.234240 master-0 kubenswrapper[28758]: I0223 14:43:46.232579 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b4cd510-6185-4d8c-9433-5b2f51ddf949-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-6b64485d68-rv56s\" (UID: \"1b4cd510-6185-4d8c-9433-5b2f51ddf949\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.234240 master-0 kubenswrapper[28758]: I0223 14:43:46.232652 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b4cd510-6185-4d8c-9433-5b2f51ddf949-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-rv56s\" (UID: \"1b4cd510-6185-4d8c-9433-5b2f51ddf949\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.261496 master-0 kubenswrapper[28758]: I0223 14:43:46.253440 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bj7gf"] Feb 23 14:43:46.261496 master-0 kubenswrapper[28758]: I0223 14:43:46.254836 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.261496 master-0 kubenswrapper[28758]: I0223 14:43:46.258897 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 23 14:43:46.287505 master-0 kubenswrapper[28758]: I0223 14:43:46.284671 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqk9q\" (UniqueName: \"kubernetes.io/projected/654cb9c9-e5a2-4875-bbd7-7616df756c19-kube-api-access-gqk9q\") pod \"obo-prometheus-operator-68bc856cb9-7b5nf\" (UID: \"654cb9c9-e5a2-4875-bbd7-7616df756c19\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf" Feb 23 14:43:46.316497 master-0 kubenswrapper[28758]: I0223 14:43:46.308691 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bj7gf"] Feb 23 14:43:46.336093 master-0 kubenswrapper[28758]: I0223 14:43:46.336031 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ac7f839-eecb-4584-800c-51d1a9a51586-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v\" (UID: \"6ac7f839-eecb-4584-800c-51d1a9a51586\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.336093 master-0 kubenswrapper[28758]: I0223 14:43:46.336097 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec4cd05c-b5ab-4bb2-b314-63698804a380-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bj7gf\" (UID: \"ec4cd05c-b5ab-4bb2-b314-63698804a380\") " pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.336337 master-0 kubenswrapper[28758]: I0223 14:43:46.336127 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1b4cd510-6185-4d8c-9433-5b2f51ddf949-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-rv56s\" (UID: \"1b4cd510-6185-4d8c-9433-5b2f51ddf949\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.336337 master-0 kubenswrapper[28758]: I0223 14:43:46.336167 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ac7f839-eecb-4584-800c-51d1a9a51586-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v\" (UID: \"6ac7f839-eecb-4584-800c-51d1a9a51586\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.336337 master-0 kubenswrapper[28758]: I0223 14:43:46.336211 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db9xf\" (UniqueName: \"kubernetes.io/projected/ec4cd05c-b5ab-4bb2-b314-63698804a380-kube-api-access-db9xf\") pod \"observability-operator-59bdc8b94-bj7gf\" (UID: \"ec4cd05c-b5ab-4bb2-b314-63698804a380\") " pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.336337 master-0 kubenswrapper[28758]: I0223 14:43:46.336239 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b4cd510-6185-4d8c-9433-5b2f51ddf949-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-rv56s\" (UID: \"1b4cd510-6185-4d8c-9433-5b2f51ddf949\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.344282 master-0 kubenswrapper[28758]: I0223 14:43:46.344221 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/1b4cd510-6185-4d8c-9433-5b2f51ddf949-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-rv56s\" (UID: \"1b4cd510-6185-4d8c-9433-5b2f51ddf949\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.344401 master-0 kubenswrapper[28758]: I0223 14:43:46.344290 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1b4cd510-6185-4d8c-9433-5b2f51ddf949-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-rv56s\" (UID: \"1b4cd510-6185-4d8c-9433-5b2f51ddf949\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.416461 master-0 kubenswrapper[28758]: I0223 14:43:46.416372 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-r5whn"] Feb 23 14:43:46.418538 master-0 kubenswrapper[28758]: I0223 14:43:46.418497 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:46.442502 master-0 kubenswrapper[28758]: I0223 14:43:46.439703 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ac7f839-eecb-4584-800c-51d1a9a51586-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v\" (UID: \"6ac7f839-eecb-4584-800c-51d1a9a51586\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.442502 master-0 kubenswrapper[28758]: I0223 14:43:46.439773 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec4cd05c-b5ab-4bb2-b314-63698804a380-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bj7gf\" (UID: \"ec4cd05c-b5ab-4bb2-b314-63698804a380\") " pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.442502 master-0 kubenswrapper[28758]: I0223 14:43:46.439815 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ac7f839-eecb-4584-800c-51d1a9a51586-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v\" (UID: \"6ac7f839-eecb-4584-800c-51d1a9a51586\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.442502 master-0 kubenswrapper[28758]: I0223 14:43:46.439852 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db9xf\" (UniqueName: \"kubernetes.io/projected/ec4cd05c-b5ab-4bb2-b314-63698804a380-kube-api-access-db9xf\") pod \"observability-operator-59bdc8b94-bj7gf\" (UID: \"ec4cd05c-b5ab-4bb2-b314-63698804a380\") " pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.459078 master-0 kubenswrapper[28758]: I0223 14:43:46.450101 
28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec4cd05c-b5ab-4bb2-b314-63698804a380-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bj7gf\" (UID: \"ec4cd05c-b5ab-4bb2-b314-63698804a380\") " pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.459078 master-0 kubenswrapper[28758]: I0223 14:43:46.452773 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ac7f839-eecb-4584-800c-51d1a9a51586-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v\" (UID: \"6ac7f839-eecb-4584-800c-51d1a9a51586\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.459078 master-0 kubenswrapper[28758]: I0223 14:43:46.452968 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ac7f839-eecb-4584-800c-51d1a9a51586-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v\" (UID: \"6ac7f839-eecb-4584-800c-51d1a9a51586\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.481578 master-0 kubenswrapper[28758]: I0223 14:43:46.480782 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db9xf\" (UniqueName: \"kubernetes.io/projected/ec4cd05c-b5ab-4bb2-b314-63698804a380-kube-api-access-db9xf\") pod \"observability-operator-59bdc8b94-bj7gf\" (UID: \"ec4cd05c-b5ab-4bb2-b314-63698804a380\") " pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.495233 master-0 kubenswrapper[28758]: I0223 14:43:46.492269 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" Feb 23 14:43:46.517520 master-0 kubenswrapper[28758]: I0223 14:43:46.517240 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-r5whn"] Feb 23 14:43:46.533502 master-0 kubenswrapper[28758]: I0223 14:43:46.523903 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" Feb 23 14:43:46.546237 master-0 kubenswrapper[28758]: I0223 14:43:46.543227 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb8v4\" (UniqueName: \"kubernetes.io/projected/37c59c37-39b7-41b9-9492-b24aa0201257-kube-api-access-xb8v4\") pod \"perses-operator-5bf474d74f-r5whn\" (UID: \"37c59c37-39b7-41b9-9492-b24aa0201257\") " pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:46.546237 master-0 kubenswrapper[28758]: I0223 14:43:46.543324 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/37c59c37-39b7-41b9-9492-b24aa0201257-openshift-service-ca\") pod \"perses-operator-5bf474d74f-r5whn\" (UID: \"37c59c37-39b7-41b9-9492-b24aa0201257\") " pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:46.585500 master-0 kubenswrapper[28758]: I0223 14:43:46.576059 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf" Feb 23 14:43:46.590513 master-0 kubenswrapper[28758]: I0223 14:43:46.585966 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:46.644835 master-0 kubenswrapper[28758]: I0223 14:43:46.644777 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/37c59c37-39b7-41b9-9492-b24aa0201257-openshift-service-ca\") pod \"perses-operator-5bf474d74f-r5whn\" (UID: \"37c59c37-39b7-41b9-9492-b24aa0201257\") " pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:46.645058 master-0 kubenswrapper[28758]: I0223 14:43:46.644943 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb8v4\" (UniqueName: \"kubernetes.io/projected/37c59c37-39b7-41b9-9492-b24aa0201257-kube-api-access-xb8v4\") pod \"perses-operator-5bf474d74f-r5whn\" (UID: \"37c59c37-39b7-41b9-9492-b24aa0201257\") " pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:46.646260 master-0 kubenswrapper[28758]: I0223 14:43:46.646221 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/37c59c37-39b7-41b9-9492-b24aa0201257-openshift-service-ca\") pod \"perses-operator-5bf474d74f-r5whn\" (UID: \"37c59c37-39b7-41b9-9492-b24aa0201257\") " pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:46.659089 master-0 kubenswrapper[28758]: I0223 14:43:46.659024 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb8v4\" (UniqueName: \"kubernetes.io/projected/37c59c37-39b7-41b9-9492-b24aa0201257-kube-api-access-xb8v4\") pod \"perses-operator-5bf474d74f-r5whn\" (UID: \"37c59c37-39b7-41b9-9492-b24aa0201257\") " pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:46.889770 master-0 kubenswrapper[28758]: I0223 14:43:46.883137 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:48.016536 master-0 kubenswrapper[28758]: I0223 14:43:48.016442 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv" event={"ID":"0bb85bd1-300c-4786-96ff-56978d399495","Type":"ContainerStarted","Data":"06c7d4891b3a321b821f1d55a0576da4d374394ef5128f3a78309c6fa9f7b8ed"} Feb 23 14:43:48.019165 master-0 kubenswrapper[28758]: I0223 14:43:48.019110 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7" event={"ID":"7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1","Type":"ContainerStarted","Data":"1a882e589535fd129ed546dd2652ec4c688e11ba433ede06dff502751a9dc134"} Feb 23 14:43:48.019830 master-0 kubenswrapper[28758]: I0223 14:43:48.019794 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7" Feb 23 14:43:48.040177 master-0 kubenswrapper[28758]: I0223 14:43:48.038224 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv" podStartSLOduration=2.274575432 podStartE2EDuration="9.03821008s" podCreationTimestamp="2026-02-23 14:43:39 +0000 UTC" firstStartedPulling="2026-02-23 14:43:40.603506199 +0000 UTC m=+552.729822131" lastFinishedPulling="2026-02-23 14:43:47.367140847 +0000 UTC m=+559.493456779" observedRunningTime="2026-02-23 14:43:48.036961687 +0000 UTC m=+560.163277629" watchObservedRunningTime="2026-02-23 14:43:48.03821008 +0000 UTC m=+560.164526012" Feb 23 14:43:48.064901 master-0 kubenswrapper[28758]: I0223 14:43:48.064829 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7" podStartSLOduration=1.906918337 podStartE2EDuration="8.064806286s" podCreationTimestamp="2026-02-23 
14:43:40 +0000 UTC" firstStartedPulling="2026-02-23 14:43:41.253526893 +0000 UTC m=+553.379842835" lastFinishedPulling="2026-02-23 14:43:47.411414852 +0000 UTC m=+559.537730784" observedRunningTime="2026-02-23 14:43:48.061169299 +0000 UTC m=+560.187485231" watchObservedRunningTime="2026-02-23 14:43:48.064806286 +0000 UTC m=+560.191122218" Feb 23 14:43:48.109035 master-0 kubenswrapper[28758]: W0223 14:43:48.108984 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b4cd510_6185_4d8c_9433_5b2f51ddf949.slice/crio-3ff58caef60bd467db135d3676f1c478408f81cd2218a99b23ee1dd538bf856e WatchSource:0}: Error finding container 3ff58caef60bd467db135d3676f1c478408f81cd2218a99b23ee1dd538bf856e: Status 404 returned error can't find the container with id 3ff58caef60bd467db135d3676f1c478408f81cd2218a99b23ee1dd538bf856e Feb 23 14:43:48.114943 master-0 kubenswrapper[28758]: W0223 14:43:48.114390 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37c59c37_39b7_41b9_9492_b24aa0201257.slice/crio-a215fc75c539fd413e65af7cc76ccc6b88947f81ade56a9eae1e6493466c1481 WatchSource:0}: Error finding container a215fc75c539fd413e65af7cc76ccc6b88947f81ade56a9eae1e6493466c1481: Status 404 returned error can't find the container with id a215fc75c539fd413e65af7cc76ccc6b88947f81ade56a9eae1e6493466c1481 Feb 23 14:43:48.116350 master-0 kubenswrapper[28758]: I0223 14:43:48.116310 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s"] Feb 23 14:43:48.116533 master-0 kubenswrapper[28758]: W0223 14:43:48.116470 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec4cd05c_b5ab_4bb2_b314_63698804a380.slice/crio-1a1a8327f675f754d3743692cd81875da9f543c79e2b9635eb646ebe016f47cf 
WatchSource:0}: Error finding container 1a1a8327f675f754d3743692cd81875da9f543c79e2b9635eb646ebe016f47cf: Status 404 returned error can't find the container with id 1a1a8327f675f754d3743692cd81875da9f543c79e2b9635eb646ebe016f47cf Feb 23 14:43:48.130411 master-0 kubenswrapper[28758]: I0223 14:43:48.130364 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-r5whn"] Feb 23 14:43:48.145296 master-0 kubenswrapper[28758]: I0223 14:43:48.145055 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bj7gf"] Feb 23 14:43:48.154653 master-0 kubenswrapper[28758]: I0223 14:43:48.152917 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v"] Feb 23 14:43:48.169405 master-0 kubenswrapper[28758]: I0223 14:43:48.169315 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf"] Feb 23 14:43:49.032503 master-0 kubenswrapper[28758]: I0223 14:43:49.030688 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf" event={"ID":"654cb9c9-e5a2-4875-bbd7-7616df756c19","Type":"ContainerStarted","Data":"97863b2da435e96710795111c8ea725772da76bda00fcaa95667b38310a5c072"} Feb 23 14:43:49.033064 master-0 kubenswrapper[28758]: I0223 14:43:49.032650 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" event={"ID":"6ac7f839-eecb-4584-800c-51d1a9a51586","Type":"ContainerStarted","Data":"d082285f92ce0f232fa135f5eeb0e5349ffcbad2f4ebcb48a9d41119e1af4917"} Feb 23 14:43:49.034076 master-0 kubenswrapper[28758]: I0223 14:43:49.034029 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" 
event={"ID":"ec4cd05c-b5ab-4bb2-b314-63698804a380","Type":"ContainerStarted","Data":"1a1a8327f675f754d3743692cd81875da9f543c79e2b9635eb646ebe016f47cf"} Feb 23 14:43:49.035175 master-0 kubenswrapper[28758]: I0223 14:43:49.035148 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-r5whn" event={"ID":"37c59c37-39b7-41b9-9492-b24aa0201257","Type":"ContainerStarted","Data":"a215fc75c539fd413e65af7cc76ccc6b88947f81ade56a9eae1e6493466c1481"} Feb 23 14:43:49.036846 master-0 kubenswrapper[28758]: I0223 14:43:49.036746 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" event={"ID":"1b4cd510-6185-4d8c-9433-5b2f51ddf949","Type":"ContainerStarted","Data":"3ff58caef60bd467db135d3676f1c478408f81cd2218a99b23ee1dd538bf856e"} Feb 23 14:43:49.037135 master-0 kubenswrapper[28758]: I0223 14:43:49.037078 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv" Feb 23 14:43:56.119149 master-0 kubenswrapper[28758]: I0223 14:43:56.118961 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" event={"ID":"1b4cd510-6185-4d8c-9433-5b2f51ddf949","Type":"ContainerStarted","Data":"763e431c50009a5ceed163bc5253317e46eb178f70151c767a3df6b3390b2d40"} Feb 23 14:43:56.125181 master-0 kubenswrapper[28758]: I0223 14:43:56.124845 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" event={"ID":"6ac7f839-eecb-4584-800c-51d1a9a51586","Type":"ContainerStarted","Data":"c80b85a796b6b24c133b327e4c1c29f7757556a9ff051915f028b46b9a287e3c"} Feb 23 14:43:56.131756 master-0 kubenswrapper[28758]: I0223 14:43:56.131640 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" event={"ID":"ec4cd05c-b5ab-4bb2-b314-63698804a380","Type":"ContainerStarted","Data":"1d343f82bd824937f1957237346fa1fe4af7ef68359e9f886aa97d0610fd0dbe"} Feb 23 14:43:56.133580 master-0 kubenswrapper[28758]: I0223 14:43:56.132091 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:56.133674 master-0 kubenswrapper[28758]: I0223 14:43:56.133644 28758 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-bj7gf container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.128.0.129:8081/healthz\": dial tcp 10.128.0.129:8081: connect: connection refused" start-of-body= Feb 23 14:43:56.133723 master-0 kubenswrapper[28758]: I0223 14:43:56.133695 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" podUID="ec4cd05c-b5ab-4bb2-b314-63698804a380" containerName="operator" probeResult="failure" output="Get \"http://10.128.0.129:8081/healthz\": dial tcp 10.128.0.129:8081: connect: connection refused" Feb 23 14:43:56.138292 master-0 kubenswrapper[28758]: I0223 14:43:56.138248 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-r5whn" event={"ID":"37c59c37-39b7-41b9-9492-b24aa0201257","Type":"ContainerStarted","Data":"f5a6bd83bc4e8ab4504cf4904de6725760213a7b28836638d538f4254bf186ed"} Feb 23 14:43:56.138795 master-0 kubenswrapper[28758]: I0223 14:43:56.138757 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-r5whn" Feb 23 14:43:56.146981 master-0 kubenswrapper[28758]: I0223 14:43:56.146902 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-rv56s" 
podStartSLOduration=2.566432691 podStartE2EDuration="10.146879349s" podCreationTimestamp="2026-02-23 14:43:46 +0000 UTC" firstStartedPulling="2026-02-23 14:43:48.111816104 +0000 UTC m=+560.238132036" lastFinishedPulling="2026-02-23 14:43:55.692262762 +0000 UTC m=+567.818578694" observedRunningTime="2026-02-23 14:43:56.141678661 +0000 UTC m=+568.267994593" watchObservedRunningTime="2026-02-23 14:43:56.146879349 +0000 UTC m=+568.273195281" Feb 23 14:43:56.216291 master-0 kubenswrapper[28758]: I0223 14:43:56.209323 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b64485d68-6hd8v" podStartSLOduration=2.681561867 podStartE2EDuration="10.209284696s" podCreationTimestamp="2026-02-23 14:43:46 +0000 UTC" firstStartedPulling="2026-02-23 14:43:48.145102617 +0000 UTC m=+560.271418539" lastFinishedPulling="2026-02-23 14:43:55.672825436 +0000 UTC m=+567.799141368" observedRunningTime="2026-02-23 14:43:56.19739853 +0000 UTC m=+568.323714462" watchObservedRunningTime="2026-02-23 14:43:56.209284696 +0000 UTC m=+568.335600628" Feb 23 14:43:56.261165 master-0 kubenswrapper[28758]: I0223 14:43:56.261065 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" podStartSLOduration=2.708051849 podStartE2EDuration="10.260762352s" podCreationTimestamp="2026-02-23 14:43:46 +0000 UTC" firstStartedPulling="2026-02-23 14:43:48.11960479 +0000 UTC m=+560.245920732" lastFinishedPulling="2026-02-23 14:43:55.672315313 +0000 UTC m=+567.798631235" observedRunningTime="2026-02-23 14:43:56.250419318 +0000 UTC m=+568.376735280" watchObservedRunningTime="2026-02-23 14:43:56.260762352 +0000 UTC m=+568.387078284" Feb 23 14:43:56.317505 master-0 kubenswrapper[28758]: I0223 14:43:56.307333 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf" 
podStartSLOduration=3.745341239 podStartE2EDuration="11.307316028s" podCreationTimestamp="2026-02-23 14:43:45 +0000 UTC" firstStartedPulling="2026-02-23 14:43:48.166163806 +0000 UTC m=+560.292479738" lastFinishedPulling="2026-02-23 14:43:55.728138585 +0000 UTC m=+567.854454527" observedRunningTime="2026-02-23 14:43:56.271847826 +0000 UTC m=+568.398163758" watchObservedRunningTime="2026-02-23 14:43:56.307316028 +0000 UTC m=+568.433631960" Feb 23 14:43:56.594915 master-0 kubenswrapper[28758]: I0223 14:43:56.594850 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-bj7gf" Feb 23 14:43:56.626734 master-0 kubenswrapper[28758]: I0223 14:43:56.626601 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-r5whn" podStartSLOduration=3.073706714 podStartE2EDuration="10.626564301s" podCreationTimestamp="2026-02-23 14:43:46 +0000 UTC" firstStartedPulling="2026-02-23 14:43:48.118326806 +0000 UTC m=+560.244642748" lastFinishedPulling="2026-02-23 14:43:55.671184403 +0000 UTC m=+567.797500335" observedRunningTime="2026-02-23 14:43:56.313565864 +0000 UTC m=+568.439881796" watchObservedRunningTime="2026-02-23 14:43:56.626564301 +0000 UTC m=+568.752880233" Feb 23 14:43:57.148394 master-0 kubenswrapper[28758]: I0223 14:43:57.148294 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-7b5nf" event={"ID":"654cb9c9-e5a2-4875-bbd7-7616df756c19","Type":"ContainerStarted","Data":"feb2ae8d8f13c40457e062bf94d5dd0a3cbd4b0756fb2bb62a8fef10255fb698"} Feb 23 14:44:00.759041 master-0 kubenswrapper[28758]: I0223 14:44:00.758967 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7" Feb 23 14:44:06.889505 master-0 kubenswrapper[28758]: I0223 14:44:06.886587 28758 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-r5whn"
Feb 23 14:44:19.991220 master-0 kubenswrapper[28758]: I0223 14:44:19.991119 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv"
Feb 23 14:44:29.889109 master-0 kubenswrapper[28758]: I0223 14:44:29.889011 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"]
Feb 23 14:44:29.890380 master-0 kubenswrapper[28758]: I0223 14:44:29.890338 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:29.893559 master-0 kubenswrapper[28758]: I0223 14:44:29.892345 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Feb 23 14:44:29.895629 master-0 kubenswrapper[28758]: I0223 14:44:29.895556 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-frlvg"]
Feb 23 14:44:29.899341 master-0 kubenswrapper[28758]: I0223 14:44:29.899276 28758 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:29.904216 master-0 kubenswrapper[28758]: I0223 14:44:29.904161 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Feb 23 14:44:29.904407 master-0 kubenswrapper[28758]: I0223 14:44:29.904298 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Feb 23 14:44:29.915581 master-0 kubenswrapper[28758]: I0223 14:44:29.915512 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"]
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.981398 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms8ht\" (UniqueName: \"kubernetes.io/projected/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-kube-api-access-ms8ht\") pod \"frr-k8s-webhook-server-78b44bf5bb-6gz6s\" (UID: \"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.981528 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.981572 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-reloader\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.981807 28758 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics-certs\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.981914 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-startup\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.981959 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-conf\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.982018 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-sockets\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.982046 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-6gz6s\" (UID: \"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:29.982517 master-0 kubenswrapper[28758]: I0223 14:44:29.982080 28758 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzcd\" (UniqueName: \"kubernetes.io/projected/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-kube-api-access-gvzcd\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.004061 master-0 kubenswrapper[28758]: I0223 14:44:30.003961 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cgfmv"]
Feb 23 14:44:30.005771 master-0 kubenswrapper[28758]: I0223 14:44:30.005737 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.013431 master-0 kubenswrapper[28758]: I0223 14:44:30.010204 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-vdjqg"]
Feb 23 14:44:30.013431 master-0 kubenswrapper[28758]: I0223 14:44:30.011877 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.013431 master-0 kubenswrapper[28758]: I0223 14:44:30.011925 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Feb 23 14:44:30.013431 master-0 kubenswrapper[28758]: I0223 14:44:30.012245 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Feb 23 14:44:30.015460 master-0 kubenswrapper[28758]: I0223 14:44:30.015415 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Feb 23 14:44:30.020522 master-0 kubenswrapper[28758]: I0223 14:44:30.015759 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Feb 23 14:44:30.025937 master-0 kubenswrapper[28758]: I0223 14:44:30.025832 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-vdjqg"]
Feb 23 14:44:30.088827 master-0
kubenswrapper[28758]: I0223 14:44:30.088517 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/669f05a2-9b50-4d6e-8d16-0a5050939b84-metrics-certs\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.088827 master-0 kubenswrapper[28758]: I0223 14:44:30.088587 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms8ht\" (UniqueName: \"kubernetes.io/projected/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-kube-api-access-ms8ht\") pod \"frr-k8s-webhook-server-78b44bf5bb-6gz6s\" (UID: \"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:30.088827 master-0 kubenswrapper[28758]: I0223 14:44:30.088677 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25q2\" (UniqueName: \"kubernetes.io/projected/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-kube-api-access-s25q2\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.088827 master-0 kubenswrapper[28758]: I0223 14:44:30.088819 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b7xr\" (UniqueName: \"kubernetes.io/projected/669f05a2-9b50-4d6e-8d16-0a5050939b84-kube-api-access-4b7xr\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.088864 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics\") pod \"frr-k8s-frlvg\" (UID:
\"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.088903 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.088931 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-metrics-certs\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.088961 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-reloader\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.089024 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics-certs\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.089073 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-startup\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.089242 master-0
kubenswrapper[28758]: I0223 14:44:30.089137 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-conf\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.089182 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-sockets\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.089210 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/669f05a2-9b50-4d6e-8d16-0a5050939b84-cert\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.089242 master-0 kubenswrapper[28758]: I0223 14:44:30.089243 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-6gz6s\" (UID: \"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:30.089687 master-0 kubenswrapper[28758]: I0223 14:44:30.089275 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-metallb-excludel2\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.089687 master-0 kubenswrapper[28758]: I0223 14:44:30.089307 28758
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzcd\" (UniqueName: \"kubernetes.io/projected/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-kube-api-access-gvzcd\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.089919 master-0 kubenswrapper[28758]: E0223 14:44:30.089890 28758 secret.go:189] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Feb 23 14:44:30.089978 master-0 kubenswrapper[28758]: E0223 14:44:30.089963 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-cert podName:df5d30d1-17cc-4d03-9f0f-d5f34c0b5715 nodeName:}" failed. No retries permitted until 2026-02-23 14:44:30.589936401 +0000 UTC m=+602.716252333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-cert") pod "frr-k8s-webhook-server-78b44bf5bb-6gz6s" (UID: "df5d30d1-17cc-4d03-9f0f-d5f34c0b5715") : secret "frr-k8s-webhook-server-cert" not found
Feb 23 14:44:30.090191 master-0 kubenswrapper[28758]: I0223 14:44:30.090130 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-reloader\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.090255 master-0 kubenswrapper[28758]: I0223 14:44:30.090209 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-sockets\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.090311 master-0 kubenswrapper[28758]: E0223 14:44:30.090289 28758 secret.go:189] Couldn't
get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Feb 23 14:44:30.090634 master-0 kubenswrapper[28758]: I0223 14:44:30.090611 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.090737 master-0 kubenswrapper[28758]: E0223 14:44:30.090699 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics-certs podName:7979db93-36fa-4bbd-99c6-e8c8ecc114f3 nodeName:}" failed. No retries permitted until 2026-02-23 14:44:30.59066487 +0000 UTC m=+602.716980802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics-certs") pod "frr-k8s-frlvg" (UID: "7979db93-36fa-4bbd-99c6-e8c8ecc114f3") : secret "frr-k8s-certs-secret" not found
Feb 23 14:44:30.091754 master-0 kubenswrapper[28758]: I0223 14:44:30.091712 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-conf\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.092970 master-0 kubenswrapper[28758]: I0223 14:44:30.092663 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-frr-startup\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.125046 master-0 kubenswrapper[28758]: I0223 14:44:30.124996 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzcd\" (UniqueName:
\"kubernetes.io/projected/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-kube-api-access-gvzcd\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.128271 master-0 kubenswrapper[28758]: I0223 14:44:30.128227 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms8ht\" (UniqueName: \"kubernetes.io/projected/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-kube-api-access-ms8ht\") pod \"frr-k8s-webhook-server-78b44bf5bb-6gz6s\" (UID: \"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:30.191690 master-0 kubenswrapper[28758]: I0223 14:44:30.190949 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/669f05a2-9b50-4d6e-8d16-0a5050939b84-cert\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.191690 master-0 kubenswrapper[28758]: I0223 14:44:30.191036 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-metallb-excludel2\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.191690 master-0 kubenswrapper[28758]: I0223 14:44:30.191414 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/669f05a2-9b50-4d6e-8d16-0a5050939b84-metrics-certs\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.191690 master-0 kubenswrapper[28758]: I0223 14:44:30.191459 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s25q2\" (UniqueName:
\"kubernetes.io/projected/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-kube-api-access-s25q2\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.191690 master-0 kubenswrapper[28758]: I0223 14:44:30.191532 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b7xr\" (UniqueName: \"kubernetes.io/projected/669f05a2-9b50-4d6e-8d16-0a5050939b84-kube-api-access-4b7xr\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.192070 master-0 kubenswrapper[28758]: I0223 14:44:30.191720 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.192714 master-0 kubenswrapper[28758]: I0223 14:44:30.192661 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-metrics-certs\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.192782 master-0 kubenswrapper[28758]: E0223 14:44:30.192707 28758 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 23 14:44:30.192782 master-0 kubenswrapper[28758]: E0223 14:44:30.192766 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist podName:1b984ba9-84db-4b51-ac8d-f92f22a4b76f nodeName:}" failed. No retries permitted until 2026-02-23 14:44:30.69274751 +0000 UTC m=+602.819063442 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist") pod "speaker-cgfmv" (UID: "1b984ba9-84db-4b51-ac8d-f92f22a4b76f") : secret "metallb-memberlist" not found
Feb 23 14:44:30.195457 master-0 kubenswrapper[28758]: I0223 14:44:30.195023 28758 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 23 14:44:30.196208 master-0 kubenswrapper[28758]: I0223 14:44:30.196164 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-metrics-certs\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.196378 master-0 kubenswrapper[28758]: I0223 14:44:30.196319 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-metallb-excludel2\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.198232 master-0 kubenswrapper[28758]: I0223 14:44:30.198191 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/669f05a2-9b50-4d6e-8d16-0a5050939b84-metrics-certs\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.206622 master-0 kubenswrapper[28758]: I0223 14:44:30.206575 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/669f05a2-9b50-4d6e-8d16-0a5050939b84-cert\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.209815 master-0 kubenswrapper[28758]: I0223
14:44:30.209764 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25q2\" (UniqueName: \"kubernetes.io/projected/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-kube-api-access-s25q2\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.212121 master-0 kubenswrapper[28758]: I0223 14:44:30.211631 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b7xr\" (UniqueName: \"kubernetes.io/projected/669f05a2-9b50-4d6e-8d16-0a5050939b84-kube-api-access-4b7xr\") pod \"controller-69bbfbf88f-vdjqg\" (UID: \"669f05a2-9b50-4d6e-8d16-0a5050939b84\") " pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.362075 master-0 kubenswrapper[28758]: I0223 14:44:30.362014 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-vdjqg"
Feb 23 14:44:30.608124 master-0 kubenswrapper[28758]: I0223 14:44:30.608054 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics-certs\") pod \"frr-k8s-frlvg\" (UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.608414 master-0 kubenswrapper[28758]: I0223 14:44:30.608147 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-6gz6s\" (UID: \"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:30.611008 master-0 kubenswrapper[28758]: I0223 14:44:30.610965 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7979db93-36fa-4bbd-99c6-e8c8ecc114f3-metrics-certs\") pod \"frr-k8s-frlvg\"
(UID: \"7979db93-36fa-4bbd-99c6-e8c8ecc114f3\") " pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:30.611442 master-0 kubenswrapper[28758]: I0223 14:44:30.611418 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df5d30d1-17cc-4d03-9f0f-d5f34c0b5715-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-6gz6s\" (UID: \"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:30.709261 master-0 kubenswrapper[28758]: I0223 14:44:30.709136 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:30.709446 master-0 kubenswrapper[28758]: E0223 14:44:30.709338 28758 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Feb 23 14:44:30.709446 master-0 kubenswrapper[28758]: E0223 14:44:30.709432 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist podName:1b984ba9-84db-4b51-ac8d-f92f22a4b76f nodeName:}" failed. No retries permitted until 2026-02-23 14:44:31.709410614 +0000 UTC m=+603.835726546 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist") pod "speaker-cgfmv" (UID: "1b984ba9-84db-4b51-ac8d-f92f22a4b76f") : secret "metallb-memberlist" not found
Feb 23 14:44:30.836148 master-0 kubenswrapper[28758]: I0223 14:44:30.836100 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-vdjqg"]
Feb 23 14:44:30.841656 master-0 kubenswrapper[28758]: W0223 14:44:30.841615 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod669f05a2_9b50_4d6e_8d16_0a5050939b84.slice/crio-873eb806a1394617461786269dc60a460604156ce8583df81a88e140e0abd624 WatchSource:0}: Error finding container 873eb806a1394617461786269dc60a460604156ce8583df81a88e140e0abd624: Status 404 returned error can't find the container with id 873eb806a1394617461786269dc60a460604156ce8583df81a88e140e0abd624
Feb 23 14:44:30.843976 master-0 kubenswrapper[28758]: I0223 14:44:30.843941 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
Feb 23 14:44:30.861045 master-0 kubenswrapper[28758]: I0223 14:44:30.860983 28758 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="metallb-system/frr-k8s-frlvg"
Feb 23 14:44:31.261330 master-0 kubenswrapper[28758]: I0223 14:44:31.261274 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"]
Feb 23 14:44:31.263944 master-0 kubenswrapper[28758]: W0223 14:44:31.263888 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf5d30d1_17cc_4d03_9f0f_d5f34c0b5715.slice/crio-66ed015300806a63d1f0a94a0605bbc59424d3337d039f6537dfa806273141bb WatchSource:0}: Error finding container 66ed015300806a63d1f0a94a0605bbc59424d3337d039f6537dfa806273141bb: Status 404 returned error can't find the container with id 66ed015300806a63d1f0a94a0605bbc59424d3337d039f6537dfa806273141bb
Feb 23 14:44:31.457377 master-0 kubenswrapper[28758]: I0223 14:44:31.457290 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerStarted","Data":"52aa06372b5f396c9285d5a525f0a64d5e0252f439513e72e8413181ffa55fb3"}
Feb 23 14:44:31.459386 master-0 kubenswrapper[28758]: I0223 14:44:31.459343 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-vdjqg" event={"ID":"669f05a2-9b50-4d6e-8d16-0a5050939b84","Type":"ContainerStarted","Data":"9e3998c9f0cd38ba39bd0d093a66b233cd31832d126ba785f5f47770bc19a4ff"}
Feb 23 14:44:31.459386 master-0 kubenswrapper[28758]: I0223 14:44:31.459378 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-vdjqg" event={"ID":"669f05a2-9b50-4d6e-8d16-0a5050939b84","Type":"ContainerStarted","Data":"873eb806a1394617461786269dc60a460604156ce8583df81a88e140e0abd624"}
Feb 23 14:44:31.460547 master-0 kubenswrapper[28758]: I0223 14:44:31.460492 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s"
event={"ID":"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715","Type":"ContainerStarted","Data":"66ed015300806a63d1f0a94a0605bbc59424d3337d039f6537dfa806273141bb"}
Feb 23 14:44:31.725341 master-0 kubenswrapper[28758]: I0223 14:44:31.725284 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:31.728805 master-0 kubenswrapper[28758]: I0223 14:44:31.728763 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1b984ba9-84db-4b51-ac8d-f92f22a4b76f-memberlist\") pod \"speaker-cgfmv\" (UID: \"1b984ba9-84db-4b51-ac8d-f92f22a4b76f\") " pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:31.861927 master-0 kubenswrapper[28758]: I0223 14:44:31.861857 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-cgfmv"
Feb 23 14:44:31.894259 master-0 kubenswrapper[28758]: W0223 14:44:31.894010 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b984ba9_84db_4b51_ac8d_f92f22a4b76f.slice/crio-795c59fb9182fb3ad09b0b5fc12cb81f6c836d9f58afa3fb9ec5b6ea7e1ccd9a WatchSource:0}: Error finding container 795c59fb9182fb3ad09b0b5fc12cb81f6c836d9f58afa3fb9ec5b6ea7e1ccd9a: Status 404 returned error can't find the container with id 795c59fb9182fb3ad09b0b5fc12cb81f6c836d9f58afa3fb9ec5b6ea7e1ccd9a
Feb 23 14:44:32.013120 master-0 kubenswrapper[28758]: I0223 14:44:32.013059 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb"]
Feb 23 14:44:32.045295 master-0 kubenswrapper[28758]: I0223 14:44:32.014709 28758 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb"
Feb 23 14:44:32.045295 master-0 kubenswrapper[28758]: I0223 14:44:32.019065 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb"]
Feb 23 14:44:32.071219 master-0 kubenswrapper[28758]: I0223 14:44:32.054646 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc"]
Feb 23 14:44:32.071219 master-0 kubenswrapper[28758]: I0223 14:44:32.056962 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc"
Feb 23 14:44:32.071219 master-0 kubenswrapper[28758]: I0223 14:44:32.058663 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 23 14:44:32.071219 master-0 kubenswrapper[28758]: I0223 14:44:32.066876 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc"]
Feb 23 14:44:32.087943 master-0 kubenswrapper[28758]: I0223 14:44:32.085204 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zs2k7"]
Feb 23 14:44:32.087943 master-0 kubenswrapper[28758]: I0223 14:44:32.086727 28758 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.133300 master-0 kubenswrapper[28758]: I0223 14:44:32.133253 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvfdc\" (UniqueName: \"kubernetes.io/projected/1233ccec-26e2-48e3-b43a-917eda81883d-kube-api-access-vvfdc\") pod \"nmstate-metrics-58c85c668d-6g7nb\" (UID: \"1233ccec-26e2-48e3-b43a-917eda81883d\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" Feb 23 14:44:32.209919 master-0 kubenswrapper[28758]: I0223 14:44:32.209836 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s"] Feb 23 14:44:32.211669 master-0 kubenswrapper[28758]: I0223 14:44:32.211085 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.214582 master-0 kubenswrapper[28758]: I0223 14:44:32.214549 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 23 14:44:32.214871 master-0 kubenswrapper[28758]: I0223 14:44:32.214842 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 23 14:44:32.228387 master-0 kubenswrapper[28758]: I0223 14:44:32.228213 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s"] Feb 23 14:44:32.234693 master-0 kubenswrapper[28758]: I0223 14:44:32.234644 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-dbus-socket\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.234877 master-0 kubenswrapper[28758]: I0223 14:44:32.234732 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpqb\" (UniqueName: \"kubernetes.io/projected/567c7ff6-21b3-4463-9a94-43b90d5fc1de-kube-api-access-5cpqb\") pod \"nmstate-webhook-866bcb46dc-nmnbc\" (UID: \"567c7ff6-21b3-4463-9a94-43b90d5fc1de\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:32.234877 master-0 kubenswrapper[28758]: I0223 14:44:32.234813 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-nmstate-lock\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.234877 master-0 kubenswrapper[28758]: I0223 14:44:32.234838 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-ovs-socket\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.234877 master-0 kubenswrapper[28758]: I0223 14:44:32.234868 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/567c7ff6-21b3-4463-9a94-43b90d5fc1de-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nmnbc\" (UID: \"567c7ff6-21b3-4463-9a94-43b90d5fc1de\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:32.234999 master-0 kubenswrapper[28758]: I0223 14:44:32.234896 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvfdc\" (UniqueName: \"kubernetes.io/projected/1233ccec-26e2-48e3-b43a-917eda81883d-kube-api-access-vvfdc\") pod \"nmstate-metrics-58c85c668d-6g7nb\" (UID: \"1233ccec-26e2-48e3-b43a-917eda81883d\") " 
pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" Feb 23 14:44:32.234999 master-0 kubenswrapper[28758]: I0223 14:44:32.234980 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2lvn\" (UniqueName: \"kubernetes.io/projected/58d61652-add1-403d-95f7-e27d89f02376-kube-api-access-j2lvn\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.261793 master-0 kubenswrapper[28758]: I0223 14:44:32.259623 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvfdc\" (UniqueName: \"kubernetes.io/projected/1233ccec-26e2-48e3-b43a-917eda81883d-kube-api-access-vvfdc\") pod \"nmstate-metrics-58c85c668d-6g7nb\" (UID: \"1233ccec-26e2-48e3-b43a-917eda81883d\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" Feb 23 14:44:32.336634 master-0 kubenswrapper[28758]: I0223 14:44:32.336554 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-ovs-socket\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.336843 master-0 kubenswrapper[28758]: I0223 14:44:32.336629 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/567c7ff6-21b3-4463-9a94-43b90d5fc1de-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nmnbc\" (UID: \"567c7ff6-21b3-4463-9a94-43b90d5fc1de\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:32.336843 master-0 kubenswrapper[28758]: I0223 14:44:32.336716 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2lvn\" (UniqueName: \"kubernetes.io/projected/58d61652-add1-403d-95f7-e27d89f02376-kube-api-access-j2lvn\") 
pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.336843 master-0 kubenswrapper[28758]: I0223 14:44:32.336786 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-dbus-socket\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.336843 master-0 kubenswrapper[28758]: I0223 14:44:32.336815 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/efa55f6d-66f7-480c-9d2e-465c0822c7a8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.336964 master-0 kubenswrapper[28758]: I0223 14:44:32.336853 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpqb\" (UniqueName: \"kubernetes.io/projected/567c7ff6-21b3-4463-9a94-43b90d5fc1de-kube-api-access-5cpqb\") pod \"nmstate-webhook-866bcb46dc-nmnbc\" (UID: \"567c7ff6-21b3-4463-9a94-43b90d5fc1de\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:32.336964 master-0 kubenswrapper[28758]: I0223 14:44:32.336904 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfkm\" (UniqueName: \"kubernetes.io/projected/efa55f6d-66f7-480c-9d2e-465c0822c7a8-kube-api-access-5kfkm\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.336964 master-0 kubenswrapper[28758]: I0223 14:44:32.336940 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/efa55f6d-66f7-480c-9d2e-465c0822c7a8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.337073 master-0 kubenswrapper[28758]: I0223 14:44:32.336986 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-nmstate-lock\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.337149 master-0 kubenswrapper[28758]: I0223 14:44:32.337080 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-nmstate-lock\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.337149 master-0 kubenswrapper[28758]: I0223 14:44:32.337131 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-ovs-socket\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.338788 master-0 kubenswrapper[28758]: I0223 14:44:32.338721 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/58d61652-add1-403d-95f7-e27d89f02376-dbus-socket\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.341712 master-0 kubenswrapper[28758]: I0223 14:44:32.341675 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/567c7ff6-21b3-4463-9a94-43b90d5fc1de-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-nmnbc\" (UID: \"567c7ff6-21b3-4463-9a94-43b90d5fc1de\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:32.364242 master-0 kubenswrapper[28758]: I0223 14:44:32.363344 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpqb\" (UniqueName: \"kubernetes.io/projected/567c7ff6-21b3-4463-9a94-43b90d5fc1de-kube-api-access-5cpqb\") pod \"nmstate-webhook-866bcb46dc-nmnbc\" (UID: \"567c7ff6-21b3-4463-9a94-43b90d5fc1de\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:32.369042 master-0 kubenswrapper[28758]: I0223 14:44:32.368994 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2lvn\" (UniqueName: \"kubernetes.io/projected/58d61652-add1-403d-95f7-e27d89f02376-kube-api-access-j2lvn\") pod \"nmstate-handler-zs2k7\" (UID: \"58d61652-add1-403d-95f7-e27d89f02376\") " pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.376859 master-0 kubenswrapper[28758]: I0223 14:44:32.376795 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" Feb 23 14:44:32.406958 master-0 kubenswrapper[28758]: I0223 14:44:32.406879 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:32.420493 master-0 kubenswrapper[28758]: I0223 14:44:32.420419 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:32.441642 master-0 kubenswrapper[28758]: I0223 14:44:32.441348 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-86d94cc75-jkcsk"] Feb 23 14:44:32.442643 master-0 kubenswrapper[28758]: I0223 14:44:32.442614 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.462975 master-0 kubenswrapper[28758]: I0223 14:44:32.458189 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d94cc75-jkcsk"] Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.480337 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cgfmv" event={"ID":"1b984ba9-84db-4b51-ac8d-f92f22a4b76f","Type":"ContainerStarted","Data":"1b5cad0ff9a80ad410a95a1942683602f0446c47362309c2f62b64f952fc87fb"} Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.480385 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cgfmv" event={"ID":"1b984ba9-84db-4b51-ac8d-f92f22a4b76f","Type":"ContainerStarted","Data":"795c59fb9182fb3ad09b0b5fc12cb81f6c836d9f58afa3fb9ec5b6ea7e1ccd9a"} Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482132 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kfkm\" (UniqueName: \"kubernetes.io/projected/efa55f6d-66f7-480c-9d2e-465c0822c7a8-kube-api-access-5kfkm\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482199 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/efa55f6d-66f7-480c-9d2e-465c0822c7a8-nginx-conf\") pod 
\"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482228 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-trusted-ca-bundle\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482324 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-serving-cert\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482414 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-oauth-config\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482448 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-config\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482527 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-service-ca\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482562 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/efa55f6d-66f7-480c-9d2e-465c0822c7a8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482592 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75frb\" (UniqueName: \"kubernetes.io/projected/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-kube-api-access-75frb\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.482637 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-oauth-serving-cert\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.488684 master-0 kubenswrapper[28758]: I0223 14:44:32.484002 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/efa55f6d-66f7-480c-9d2e-465c0822c7a8-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.489183 master-0 kubenswrapper[28758]: I0223 14:44:32.489098 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/efa55f6d-66f7-480c-9d2e-465c0822c7a8-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.500058 master-0 kubenswrapper[28758]: I0223 14:44:32.490969 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-vdjqg" event={"ID":"669f05a2-9b50-4d6e-8d16-0a5050939b84","Type":"ContainerStarted","Data":"d5a389ff967e1508511d5455da6cd3c7b89fd0e22add6d98323dc206311b3a6d"} Feb 23 14:44:32.500058 master-0 kubenswrapper[28758]: I0223 14:44:32.491121 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-vdjqg" Feb 23 14:44:32.502572 master-0 kubenswrapper[28758]: I0223 14:44:32.502537 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kfkm\" (UniqueName: \"kubernetes.io/projected/efa55f6d-66f7-480c-9d2e-465c0822c7a8-kube-api-access-5kfkm\") pod \"nmstate-console-plugin-5c78fc5d65-ng24s\" (UID: \"efa55f6d-66f7-480c-9d2e-465c0822c7a8\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.536804 master-0 kubenswrapper[28758]: I0223 14:44:32.534806 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-vdjqg" podStartSLOduration=2.5772773190000002 podStartE2EDuration="3.534782075s" podCreationTimestamp="2026-02-23 14:44:29 +0000 UTC" firstStartedPulling="2026-02-23 14:44:30.985727508 +0000 UTC m=+603.112043440" lastFinishedPulling="2026-02-23 14:44:31.943232264 +0000 UTC m=+604.069548196" observedRunningTime="2026-02-23 14:44:32.514906987 
+0000 UTC m=+604.641222929" watchObservedRunningTime="2026-02-23 14:44:32.534782075 +0000 UTC m=+604.661098017" Feb 23 14:44:32.541333 master-0 kubenswrapper[28758]: I0223 14:44:32.540645 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.589239 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-serving-cert\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.589577 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-oauth-config\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.589617 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-config\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.589672 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-service-ca\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: 
I0223 14:44:32.589706 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75frb\" (UniqueName: \"kubernetes.io/projected/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-kube-api-access-75frb\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.589939 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-oauth-serving-cert\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.590005 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-trusted-ca-bundle\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.591840 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-service-ca\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 kubenswrapper[28758]: I0223 14:44:32.592841 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-config\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.596519 master-0 
kubenswrapper[28758]: I0223 14:44:32.594345 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-oauth-config\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.600532 master-0 kubenswrapper[28758]: I0223 14:44:32.598171 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-console-serving-cert\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.618562 master-0 kubenswrapper[28758]: I0223 14:44:32.615831 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75frb\" (UniqueName: \"kubernetes.io/projected/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-kube-api-access-75frb\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.647358 master-0 kubenswrapper[28758]: I0223 14:44:32.647318 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-oauth-serving-cert\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.648593 master-0 kubenswrapper[28758]: I0223 14:44:32.648566 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a05e5a62-a07f-4f60-8c77-68b7acaa07f0-trusted-ca-bundle\") pod \"console-86d94cc75-jkcsk\" (UID: \"a05e5a62-a07f-4f60-8c77-68b7acaa07f0\") " pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 
14:44:32.789976 master-0 kubenswrapper[28758]: I0223 14:44:32.784506 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:32.880863 master-0 kubenswrapper[28758]: I0223 14:44:32.880813 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc"] Feb 23 14:44:32.993916 master-0 kubenswrapper[28758]: I0223 14:44:32.993866 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb"] Feb 23 14:44:33.072657 master-0 kubenswrapper[28758]: I0223 14:44:33.072583 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s"] Feb 23 14:44:33.077607 master-0 kubenswrapper[28758]: W0223 14:44:33.077554 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefa55f6d_66f7_480c_9d2e_465c0822c7a8.slice/crio-2724159346a0f18b31a6d6b07b6af5523d24b16ea428334538ff3031fb814093 WatchSource:0}: Error finding container 2724159346a0f18b31a6d6b07b6af5523d24b16ea428334538ff3031fb814093: Status 404 returned error can't find the container with id 2724159346a0f18b31a6d6b07b6af5523d24b16ea428334538ff3031fb814093 Feb 23 14:44:33.228912 master-0 kubenswrapper[28758]: I0223 14:44:33.228833 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d94cc75-jkcsk"] Feb 23 14:44:33.231318 master-0 kubenswrapper[28758]: W0223 14:44:33.231267 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda05e5a62_a07f_4f60_8c77_68b7acaa07f0.slice/crio-b4fbc9796fc3544002b76b2376636c709622713b49594f738130f8d8003b099c WatchSource:0}: Error finding container b4fbc9796fc3544002b76b2376636c709622713b49594f738130f8d8003b099c: Status 404 returned error can't find the container with id 
b4fbc9796fc3544002b76b2376636c709622713b49594f738130f8d8003b099c Feb 23 14:44:33.499692 master-0 kubenswrapper[28758]: I0223 14:44:33.499606 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zs2k7" event={"ID":"58d61652-add1-403d-95f7-e27d89f02376","Type":"ContainerStarted","Data":"15f85cd2be045c8d561de202ea17cf4531776c5d0d1337a1838fb2ca7408436f"} Feb 23 14:44:33.501464 master-0 kubenswrapper[28758]: I0223 14:44:33.501410 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cgfmv" event={"ID":"1b984ba9-84db-4b51-ac8d-f92f22a4b76f","Type":"ContainerStarted","Data":"e58ce89a0a6c6b260c1aa54506340fef20c0249273b4a91cadbcbd8088ffd068"} Feb 23 14:44:33.501545 master-0 kubenswrapper[28758]: I0223 14:44:33.501526 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cgfmv" Feb 23 14:44:33.503052 master-0 kubenswrapper[28758]: I0223 14:44:33.502987 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d94cc75-jkcsk" event={"ID":"a05e5a62-a07f-4f60-8c77-68b7acaa07f0","Type":"ContainerStarted","Data":"bc311289f7e595c0cf0232f7ea6b759c74e1a53a9fdaf3387b119718938b165f"} Feb 23 14:44:33.503133 master-0 kubenswrapper[28758]: I0223 14:44:33.503053 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d94cc75-jkcsk" event={"ID":"a05e5a62-a07f-4f60-8c77-68b7acaa07f0","Type":"ContainerStarted","Data":"b4fbc9796fc3544002b76b2376636c709622713b49594f738130f8d8003b099c"} Feb 23 14:44:33.504275 master-0 kubenswrapper[28758]: I0223 14:44:33.504226 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" event={"ID":"1233ccec-26e2-48e3-b43a-917eda81883d","Type":"ContainerStarted","Data":"85c93c96c01e1cee2d65662f2551b1d08ded427eff973e9bf3b6ea065b872582"} Feb 23 14:44:33.505340 master-0 kubenswrapper[28758]: I0223 14:44:33.505283 28758 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" event={"ID":"efa55f6d-66f7-480c-9d2e-465c0822c7a8","Type":"ContainerStarted","Data":"2724159346a0f18b31a6d6b07b6af5523d24b16ea428334538ff3031fb814093"} Feb 23 14:44:33.506359 master-0 kubenswrapper[28758]: I0223 14:44:33.506301 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" event={"ID":"567c7ff6-21b3-4463-9a94-43b90d5fc1de","Type":"ContainerStarted","Data":"e241595280a3ed8d5fb693dceff84235a6b08fa3494d6e9b4dece2718f6f770b"} Feb 23 14:44:33.526786 master-0 kubenswrapper[28758]: I0223 14:44:33.526708 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cgfmv" podStartSLOduration=4.526682743 podStartE2EDuration="4.526682743s" podCreationTimestamp="2026-02-23 14:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:44:33.519065141 +0000 UTC m=+605.645381073" watchObservedRunningTime="2026-02-23 14:44:33.526682743 +0000 UTC m=+605.652998675" Feb 23 14:44:33.548635 master-0 kubenswrapper[28758]: I0223 14:44:33.548504 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86d94cc75-jkcsk" podStartSLOduration=1.548466231 podStartE2EDuration="1.548466231s" podCreationTimestamp="2026-02-23 14:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:44:33.540323105 +0000 UTC m=+605.666639047" watchObservedRunningTime="2026-02-23 14:44:33.548466231 +0000 UTC m=+605.674782173" Feb 23 14:44:38.562296 master-0 kubenswrapper[28758]: I0223 14:44:38.562199 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" 
event={"ID":"1233ccec-26e2-48e3-b43a-917eda81883d","Type":"ContainerStarted","Data":"72611849bc3f24f12f278426df8327184e0da13ad0e9cee6764a3ed3fe08c61a"} Feb 23 14:44:39.571825 master-0 kubenswrapper[28758]: I0223 14:44:39.571731 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" event={"ID":"1233ccec-26e2-48e3-b43a-917eda81883d","Type":"ContainerStarted","Data":"f949fa85fd58504a1d8a4fc2da94eb76728994475004929eb1bc4ee25e514eae"} Feb 23 14:44:39.573378 master-0 kubenswrapper[28758]: I0223 14:44:39.573307 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" event={"ID":"efa55f6d-66f7-480c-9d2e-465c0822c7a8","Type":"ContainerStarted","Data":"08c9a52c79928e0c766b8523bd1158a423b52a2a14f820e4fa7b5604f4b36178"} Feb 23 14:44:39.574930 master-0 kubenswrapper[28758]: I0223 14:44:39.574868 28758 generic.go:334] "Generic (PLEG): container finished" podID="7979db93-36fa-4bbd-99c6-e8c8ecc114f3" containerID="dac361eacae207c33a403275128aef1f5a8abd4cb7b60bb0c6cf07432dda8906" exitCode=0 Feb 23 14:44:39.575068 master-0 kubenswrapper[28758]: I0223 14:44:39.574959 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerDied","Data":"dac361eacae207c33a403275128aef1f5a8abd4cb7b60bb0c6cf07432dda8906"} Feb 23 14:44:39.584373 master-0 kubenswrapper[28758]: I0223 14:44:39.584172 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" event={"ID":"567c7ff6-21b3-4463-9a94-43b90d5fc1de","Type":"ContainerStarted","Data":"23423e3b3e03ff835d8283740eec344dbf323f1f62ca0771e07d615141eed880"} Feb 23 14:44:39.584373 master-0 kubenswrapper[28758]: I0223 14:44:39.584305 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:39.586693 
master-0 kubenswrapper[28758]: I0223 14:44:39.586615 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s" event={"ID":"df5d30d1-17cc-4d03-9f0f-d5f34c0b5715","Type":"ContainerStarted","Data":"571ef22122e4c924547333a7026de704c82a841cb3e5a2166ca99e5a5d00234b"} Feb 23 14:44:39.586813 master-0 kubenswrapper[28758]: I0223 14:44:39.586712 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s" Feb 23 14:44:39.589818 master-0 kubenswrapper[28758]: I0223 14:44:39.589758 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zs2k7" event={"ID":"58d61652-add1-403d-95f7-e27d89f02376","Type":"ContainerStarted","Data":"db0c369bae5cc9eb9045a30ebfd79efad38eb2798ee34de3a395e83d766fcf10"} Feb 23 14:44:39.590524 master-0 kubenswrapper[28758]: I0223 14:44:39.590443 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:39.697382 master-0 kubenswrapper[28758]: I0223 14:44:39.694295 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-6g7nb" podStartSLOduration=4.030297343 podStartE2EDuration="8.69427368s" podCreationTimestamp="2026-02-23 14:44:31 +0000 UTC" firstStartedPulling="2026-02-23 14:44:32.990798409 +0000 UTC m=+605.117114331" lastFinishedPulling="2026-02-23 14:44:37.654774736 +0000 UTC m=+609.781090668" observedRunningTime="2026-02-23 14:44:39.688565428 +0000 UTC m=+611.814881420" watchObservedRunningTime="2026-02-23 14:44:39.69427368 +0000 UTC m=+611.820589612" Feb 23 14:44:39.834118 master-0 kubenswrapper[28758]: I0223 14:44:39.834031 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" podStartSLOduration=4.053048187 podStartE2EDuration="8.834010239s" 
podCreationTimestamp="2026-02-23 14:44:31 +0000 UTC" firstStartedPulling="2026-02-23 14:44:32.889279234 +0000 UTC m=+605.015595166" lastFinishedPulling="2026-02-23 14:44:37.670241286 +0000 UTC m=+609.796557218" observedRunningTime="2026-02-23 14:44:39.830950348 +0000 UTC m=+611.957266290" watchObservedRunningTime="2026-02-23 14:44:39.834010239 +0000 UTC m=+611.960326171" Feb 23 14:44:39.960956 master-0 kubenswrapper[28758]: I0223 14:44:39.960779 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-ng24s" podStartSLOduration=3.370929124 podStartE2EDuration="7.960756573s" podCreationTimestamp="2026-02-23 14:44:32 +0000 UTC" firstStartedPulling="2026-02-23 14:44:33.079910144 +0000 UTC m=+605.206226126" lastFinishedPulling="2026-02-23 14:44:37.669737643 +0000 UTC m=+609.796053575" observedRunningTime="2026-02-23 14:44:39.953781288 +0000 UTC m=+612.080097240" watchObservedRunningTime="2026-02-23 14:44:39.960756573 +0000 UTC m=+612.087072505" Feb 23 14:44:40.014841 master-0 kubenswrapper[28758]: I0223 14:44:40.014767 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zs2k7" podStartSLOduration=2.8749166280000003 podStartE2EDuration="8.014749736s" podCreationTimestamp="2026-02-23 14:44:32 +0000 UTC" firstStartedPulling="2026-02-23 14:44:32.539702925 +0000 UTC m=+604.666018857" lastFinishedPulling="2026-02-23 14:44:37.679536033 +0000 UTC m=+609.805851965" observedRunningTime="2026-02-23 14:44:40.009918668 +0000 UTC m=+612.136234610" watchObservedRunningTime="2026-02-23 14:44:40.014749736 +0000 UTC m=+612.141065668" Feb 23 14:44:40.029308 master-0 kubenswrapper[28758]: I0223 14:44:40.029032 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s" podStartSLOduration=4.62490161 podStartE2EDuration="11.029010885s" podCreationTimestamp="2026-02-23 14:44:29 +0000 
UTC" firstStartedPulling="2026-02-23 14:44:31.266009318 +0000 UTC m=+603.392325250" lastFinishedPulling="2026-02-23 14:44:37.670118593 +0000 UTC m=+609.796434525" observedRunningTime="2026-02-23 14:44:40.026975411 +0000 UTC m=+612.153291343" watchObservedRunningTime="2026-02-23 14:44:40.029010885 +0000 UTC m=+612.155326817" Feb 23 14:44:40.365855 master-0 kubenswrapper[28758]: I0223 14:44:40.365784 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-vdjqg" Feb 23 14:44:40.601255 master-0 kubenswrapper[28758]: I0223 14:44:40.601188 28758 generic.go:334] "Generic (PLEG): container finished" podID="7979db93-36fa-4bbd-99c6-e8c8ecc114f3" containerID="ad452b181914b1417412f05a05ca8b91ae75e9de8840a43b7abeb59e7fb1b182" exitCode=0 Feb 23 14:44:40.601787 master-0 kubenswrapper[28758]: I0223 14:44:40.601263 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerDied","Data":"ad452b181914b1417412f05a05ca8b91ae75e9de8840a43b7abeb59e7fb1b182"} Feb 23 14:44:41.614558 master-0 kubenswrapper[28758]: I0223 14:44:41.614470 28758 generic.go:334] "Generic (PLEG): container finished" podID="7979db93-36fa-4bbd-99c6-e8c8ecc114f3" containerID="5c1a76478d8ca68774c67a3306d51af09a51b4e69d673b9913c60c01432139b9" exitCode=0 Feb 23 14:44:41.615151 master-0 kubenswrapper[28758]: I0223 14:44:41.614562 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerDied","Data":"5c1a76478d8ca68774c67a3306d51af09a51b4e69d673b9913c60c01432139b9"} Feb 23 14:44:42.625499 master-0 kubenswrapper[28758]: I0223 14:44:42.625425 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" 
event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerStarted","Data":"e5d4e81f595ba7d53db94304fbaa25facfaa03d6af179a7312d72bea298ec76b"} Feb 23 14:44:42.625499 master-0 kubenswrapper[28758]: I0223 14:44:42.625501 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerStarted","Data":"514bccf7bd801d014b5a0dce69b5861d3fe8c1d0ce291d49d8898ea09f2bfcd9"} Feb 23 14:44:42.625499 master-0 kubenswrapper[28758]: I0223 14:44:42.625512 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerStarted","Data":"861251fcdcc9c2761be55058bf462c1f6f33fe1a2f1d01ad4e0450cf144666a1"} Feb 23 14:44:42.626193 master-0 kubenswrapper[28758]: I0223 14:44:42.625521 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerStarted","Data":"c5851dffdec46741309088979492c79ac8a5150fd237123df1ee098f57aa4c99"} Feb 23 14:44:42.626193 master-0 kubenswrapper[28758]: I0223 14:44:42.625530 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerStarted","Data":"efc2fb1f3c8cdfc832a56111492655faafd2400cc2fb686443fc194916b5be31"} Feb 23 14:44:42.786081 master-0 kubenswrapper[28758]: I0223 14:44:42.785625 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:42.786570 master-0 kubenswrapper[28758]: I0223 14:44:42.786539 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:42.791139 master-0 kubenswrapper[28758]: I0223 14:44:42.791105 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:43.641511 master-0 kubenswrapper[28758]: I0223 14:44:43.641427 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-frlvg" event={"ID":"7979db93-36fa-4bbd-99c6-e8c8ecc114f3","Type":"ContainerStarted","Data":"085cf05217cacc2d64254ca591eb5dbae84451ef59e72abb191feadb06438395"} Feb 23 14:44:43.642787 master-0 kubenswrapper[28758]: I0223 14:44:43.641643 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-frlvg" Feb 23 14:44:43.645131 master-0 kubenswrapper[28758]: I0223 14:44:43.645090 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86d94cc75-jkcsk" Feb 23 14:44:43.669300 master-0 kubenswrapper[28758]: I0223 14:44:43.668634 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-frlvg" podStartSLOduration=7.950667376 podStartE2EDuration="14.668610291s" podCreationTimestamp="2026-02-23 14:44:29 +0000 UTC" firstStartedPulling="2026-02-23 14:44:30.988091391 +0000 UTC m=+603.114407323" lastFinishedPulling="2026-02-23 14:44:37.706034306 +0000 UTC m=+609.832350238" observedRunningTime="2026-02-23 14:44:43.664901753 +0000 UTC m=+615.791217725" watchObservedRunningTime="2026-02-23 14:44:43.668610291 +0000 UTC m=+615.794926233" Feb 23 14:44:43.756062 master-0 kubenswrapper[28758]: I0223 14:44:43.755985 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-688ffc4cbf-2brtn"] Feb 23 14:44:45.861706 master-0 kubenswrapper[28758]: I0223 14:44:45.861628 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-frlvg" Feb 23 14:44:45.903018 master-0 kubenswrapper[28758]: I0223 14:44:45.902956 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-frlvg" Feb 23 14:44:47.449351 master-0 kubenswrapper[28758]: I0223 
14:44:47.449254 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zs2k7" Feb 23 14:44:50.852247 master-0 kubenswrapper[28758]: I0223 14:44:50.852167 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-6gz6s" Feb 23 14:44:51.865898 master-0 kubenswrapper[28758]: I0223 14:44:51.865777 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cgfmv" Feb 23 14:44:52.416375 master-0 kubenswrapper[28758]: I0223 14:44:52.416279 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-nmnbc" Feb 23 14:44:59.310116 master-0 kubenswrapper[28758]: I0223 14:44:59.310028 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-5jvzp"] Feb 23 14:44:59.311635 master-0 kubenswrapper[28758]: I0223 14:44:59.311601 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.313843 master-0 kubenswrapper[28758]: I0223 14:44:59.313802 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Feb 23 14:44:59.338156 master-0 kubenswrapper[28758]: I0223 14:44:59.338083 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-5jvzp"] Feb 23 14:44:59.409948 master-0 kubenswrapper[28758]: I0223 14:44:59.409886 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-lvmd-config\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.409948 master-0 kubenswrapper[28758]: I0223 14:44:59.409939 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-device-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.409968 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-node-plugin-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.409989 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-registration-dir\") pod \"vg-manager-5jvzp\" (UID: 
\"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.410046 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-sys\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.410064 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mmls\" (UniqueName: \"kubernetes.io/projected/40428987-a2a4-4b49-b2fe-39e0d234032f-kube-api-access-5mmls\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.410081 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-pod-volumes-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.410096 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-run-udev\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.410110 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-file-lock-dir\") pod 
\"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.410129 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-csi-plugin-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.410270 master-0 kubenswrapper[28758]: I0223 14:44:59.410168 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/40428987-a2a4-4b49-b2fe-39e0d234032f-metrics-cert\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.511710 master-0 kubenswrapper[28758]: I0223 14:44:59.511640 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-sys\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.511913 master-0 kubenswrapper[28758]: I0223 14:44:59.511719 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mmls\" (UniqueName: \"kubernetes.io/projected/40428987-a2a4-4b49-b2fe-39e0d234032f-kube-api-access-5mmls\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.511913 master-0 kubenswrapper[28758]: I0223 14:44:59.511787 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-sys\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " 
pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.511982 master-0 kubenswrapper[28758]: I0223 14:44:59.511904 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-pod-volumes-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512016 master-0 kubenswrapper[28758]: I0223 14:44:59.511988 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-run-udev\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512053 master-0 kubenswrapper[28758]: I0223 14:44:59.512019 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-file-lock-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512053 master-0 kubenswrapper[28758]: I0223 14:44:59.512026 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-pod-volumes-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512053 master-0 kubenswrapper[28758]: I0223 14:44:59.512049 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-csi-plugin-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512181 master-0 
kubenswrapper[28758]: I0223 14:44:59.512074 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-run-udev\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512272 master-0 kubenswrapper[28758]: I0223 14:44:59.512225 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/40428987-a2a4-4b49-b2fe-39e0d234032f-metrics-cert\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512351 master-0 kubenswrapper[28758]: I0223 14:44:59.512324 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-file-lock-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512393 master-0 kubenswrapper[28758]: I0223 14:44:59.512353 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-lvmd-config\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512393 master-0 kubenswrapper[28758]: I0223 14:44:59.512369 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-csi-plugin-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512453 master-0 kubenswrapper[28758]: I0223 14:44:59.512396 28758 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-device-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512506 master-0 kubenswrapper[28758]: I0223 14:44:59.512446 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-node-plugin-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512547 master-0 kubenswrapper[28758]: I0223 14:44:59.512522 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-registration-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512594 master-0 kubenswrapper[28758]: I0223 14:44:59.512531 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-device-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512594 master-0 kubenswrapper[28758]: I0223 14:44:59.512576 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-lvmd-config\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512659 master-0 kubenswrapper[28758]: I0223 14:44:59.512627 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-registration-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.512693 master-0 kubenswrapper[28758]: I0223 14:44:59.512676 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/40428987-a2a4-4b49-b2fe-39e0d234032f-node-plugin-dir\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.516394 master-0 kubenswrapper[28758]: I0223 14:44:59.516358 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/40428987-a2a4-4b49-b2fe-39e0d234032f-metrics-cert\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.533216 master-0 kubenswrapper[28758]: I0223 14:44:59.533154 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mmls\" (UniqueName: \"kubernetes.io/projected/40428987-a2a4-4b49-b2fe-39e0d234032f-kube-api-access-5mmls\") pod \"vg-manager-5jvzp\" (UID: \"40428987-a2a4-4b49-b2fe-39e0d234032f\") " pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:44:59.632830 master-0 kubenswrapper[28758]: I0223 14:44:59.632650 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:45:00.397469 master-0 kubenswrapper[28758]: I0223 14:45:00.397375 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-5jvzp"] Feb 23 14:45:00.405332 master-0 kubenswrapper[28758]: W0223 14:45:00.405229 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40428987_a2a4_4b49_b2fe_39e0d234032f.slice/crio-98335d27c261862e44544eb69c3d6698aa541f680e9daab91fd37f716f5aa3d3 WatchSource:0}: Error finding container 98335d27c261862e44544eb69c3d6698aa541f680e9daab91fd37f716f5aa3d3: Status 404 returned error can't find the container with id 98335d27c261862e44544eb69c3d6698aa541f680e9daab91fd37f716f5aa3d3 Feb 23 14:45:00.452486 master-0 kubenswrapper[28758]: I0223 14:45:00.452407 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c"] Feb 23 14:45:00.453376 master-0 kubenswrapper[28758]: I0223 14:45:00.453351 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.455405 master-0 kubenswrapper[28758]: I0223 14:45:00.455365 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-jqqg2" Feb 23 14:45:00.455570 master-0 kubenswrapper[28758]: I0223 14:45:00.455511 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 14:45:00.531731 master-0 kubenswrapper[28758]: I0223 14:45:00.531563 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c"] Feb 23 14:45:00.533012 master-0 kubenswrapper[28758]: I0223 14:45:00.532979 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57kbm\" (UniqueName: \"kubernetes.io/projected/e8af9b04-6b1e-488a-bd1e-686edc98b077-kube-api-access-57kbm\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.533183 master-0 kubenswrapper[28758]: I0223 14:45:00.533106 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8af9b04-6b1e-488a-bd1e-686edc98b077-config-volume\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.533416 master-0 kubenswrapper[28758]: I0223 14:45:00.533278 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8af9b04-6b1e-488a-bd1e-686edc98b077-secret-volume\") pod \"collect-profiles-29530965-bqk6c\" (UID: 
\"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.636426 master-0 kubenswrapper[28758]: I0223 14:45:00.635184 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57kbm\" (UniqueName: \"kubernetes.io/projected/e8af9b04-6b1e-488a-bd1e-686edc98b077-kube-api-access-57kbm\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.636426 master-0 kubenswrapper[28758]: I0223 14:45:00.635280 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8af9b04-6b1e-488a-bd1e-686edc98b077-config-volume\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.636426 master-0 kubenswrapper[28758]: I0223 14:45:00.635747 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8af9b04-6b1e-488a-bd1e-686edc98b077-secret-volume\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.637996 master-0 kubenswrapper[28758]: I0223 14:45:00.637800 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8af9b04-6b1e-488a-bd1e-686edc98b077-config-volume\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.640198 master-0 kubenswrapper[28758]: I0223 14:45:00.640177 28758 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8af9b04-6b1e-488a-bd1e-686edc98b077-secret-volume\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.672207 master-0 kubenswrapper[28758]: I0223 14:45:00.671956 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57kbm\" (UniqueName: \"kubernetes.io/projected/e8af9b04-6b1e-488a-bd1e-686edc98b077-kube-api-access-57kbm\") pod \"collect-profiles-29530965-bqk6c\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.790105 master-0 kubenswrapper[28758]: I0223 14:45:00.790015 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:00.804450 master-0 kubenswrapper[28758]: I0223 14:45:00.804356 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5jvzp" event={"ID":"40428987-a2a4-4b49-b2fe-39e0d234032f","Type":"ContainerStarted","Data":"4545d2ba576971e8ab1db47277b0f0625a489666100e3ffd8d6399da769592e7"} Feb 23 14:45:00.804450 master-0 kubenswrapper[28758]: I0223 14:45:00.804452 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5jvzp" event={"ID":"40428987-a2a4-4b49-b2fe-39e0d234032f","Type":"ContainerStarted","Data":"98335d27c261862e44544eb69c3d6698aa541f680e9daab91fd37f716f5aa3d3"} Feb 23 14:45:00.842255 master-0 kubenswrapper[28758]: I0223 14:45:00.842144 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-5jvzp" podStartSLOduration=1.8421225460000001 podStartE2EDuration="1.842122546s" podCreationTimestamp="2026-02-23 14:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:45:00.834061432 +0000 UTC m=+632.960377374" watchObservedRunningTime="2026-02-23 14:45:00.842122546 +0000 UTC m=+632.968438478" Feb 23 14:45:00.883106 master-0 kubenswrapper[28758]: I0223 14:45:00.869855 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-frlvg" Feb 23 14:45:01.249672 master-0 kubenswrapper[28758]: I0223 14:45:01.248576 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c"] Feb 23 14:45:01.820746 master-0 kubenswrapper[28758]: I0223 14:45:01.820624 28758 generic.go:334] "Generic (PLEG): container finished" podID="e8af9b04-6b1e-488a-bd1e-686edc98b077" containerID="24b51587de56366d55651b082d62405c8ede091afd16526e2cf5a388a27fb538" exitCode=0 Feb 23 14:45:01.820746 master-0 kubenswrapper[28758]: I0223 14:45:01.820702 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" event={"ID":"e8af9b04-6b1e-488a-bd1e-686edc98b077","Type":"ContainerDied","Data":"24b51587de56366d55651b082d62405c8ede091afd16526e2cf5a388a27fb538"} Feb 23 14:45:01.821332 master-0 kubenswrapper[28758]: I0223 14:45:01.820794 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" event={"ID":"e8af9b04-6b1e-488a-bd1e-686edc98b077","Type":"ContainerStarted","Data":"a5f49c0b7369a8c0b9cb927bb77d0cf264882242daeffc029032268ef4d99b9e"} Feb 23 14:45:02.831512 master-0 kubenswrapper[28758]: I0223 14:45:02.831442 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-5jvzp_40428987-a2a4-4b49-b2fe-39e0d234032f/vg-manager/0.log" Feb 23 14:45:02.832356 master-0 kubenswrapper[28758]: I0223 14:45:02.831532 28758 generic.go:334] "Generic (PLEG): container finished" 
podID="40428987-a2a4-4b49-b2fe-39e0d234032f" containerID="4545d2ba576971e8ab1db47277b0f0625a489666100e3ffd8d6399da769592e7" exitCode=1 Feb 23 14:45:02.832356 master-0 kubenswrapper[28758]: I0223 14:45:02.831721 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5jvzp" event={"ID":"40428987-a2a4-4b49-b2fe-39e0d234032f","Type":"ContainerDied","Data":"4545d2ba576971e8ab1db47277b0f0625a489666100e3ffd8d6399da769592e7"} Feb 23 14:45:02.832555 master-0 kubenswrapper[28758]: I0223 14:45:02.832498 28758 scope.go:117] "RemoveContainer" containerID="4545d2ba576971e8ab1db47277b0f0625a489666100e3ffd8d6399da769592e7" Feb 23 14:45:03.200778 master-0 kubenswrapper[28758]: I0223 14:45:03.199773 28758 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Feb 23 14:45:03.253615 master-0 kubenswrapper[28758]: I0223 14:45:03.252151 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:03.288449 master-0 kubenswrapper[28758]: I0223 14:45:03.286517 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8af9b04-6b1e-488a-bd1e-686edc98b077-secret-volume\") pod \"e8af9b04-6b1e-488a-bd1e-686edc98b077\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " Feb 23 14:45:03.288449 master-0 kubenswrapper[28758]: I0223 14:45:03.286674 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8af9b04-6b1e-488a-bd1e-686edc98b077-config-volume\") pod \"e8af9b04-6b1e-488a-bd1e-686edc98b077\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " Feb 23 14:45:03.288449 master-0 kubenswrapper[28758]: I0223 14:45:03.286710 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57kbm\" (UniqueName: \"kubernetes.io/projected/e8af9b04-6b1e-488a-bd1e-686edc98b077-kube-api-access-57kbm\") pod \"e8af9b04-6b1e-488a-bd1e-686edc98b077\" (UID: \"e8af9b04-6b1e-488a-bd1e-686edc98b077\") " Feb 23 14:45:03.288449 master-0 kubenswrapper[28758]: I0223 14:45:03.287333 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8af9b04-6b1e-488a-bd1e-686edc98b077-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8af9b04-6b1e-488a-bd1e-686edc98b077" (UID: "e8af9b04-6b1e-488a-bd1e-686edc98b077"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:45:03.290761 master-0 kubenswrapper[28758]: I0223 14:45:03.290081 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8af9b04-6b1e-488a-bd1e-686edc98b077-kube-api-access-57kbm" (OuterVolumeSpecName: "kube-api-access-57kbm") pod "e8af9b04-6b1e-488a-bd1e-686edc98b077" (UID: "e8af9b04-6b1e-488a-bd1e-686edc98b077"). InnerVolumeSpecName "kube-api-access-57kbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:45:03.290761 master-0 kubenswrapper[28758]: I0223 14:45:03.290168 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8af9b04-6b1e-488a-bd1e-686edc98b077-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8af9b04-6b1e-488a-bd1e-686edc98b077" (UID: "e8af9b04-6b1e-488a-bd1e-686edc98b077"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:45:03.389537 master-0 kubenswrapper[28758]: I0223 14:45:03.389111 28758 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8af9b04-6b1e-488a-bd1e-686edc98b077-config-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:03.389537 master-0 kubenswrapper[28758]: I0223 14:45:03.389194 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57kbm\" (UniqueName: \"kubernetes.io/projected/e8af9b04-6b1e-488a-bd1e-686edc98b077-kube-api-access-57kbm\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:03.389537 master-0 kubenswrapper[28758]: I0223 14:45:03.389211 28758 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8af9b04-6b1e-488a-bd1e-686edc98b077-secret-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:03.758561 master-0 kubenswrapper[28758]: I0223 14:45:03.758412 28758 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-02-23T14:45:03.199814516Z","Handler":null,"Name":""} Feb 23 14:45:03.760762 master-0 kubenswrapper[28758]: I0223 14:45:03.760681 28758 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Feb 23 14:45:03.760762 master-0 kubenswrapper[28758]: I0223 14:45:03.760755 28758 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Feb 23 14:45:03.841215 master-0 kubenswrapper[28758]: I0223 14:45:03.841149 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-5jvzp_40428987-a2a4-4b49-b2fe-39e0d234032f/vg-manager/0.log" Feb 23 14:45:03.841714 master-0 kubenswrapper[28758]: I0223 14:45:03.841286 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5jvzp" event={"ID":"40428987-a2a4-4b49-b2fe-39e0d234032f","Type":"ContainerStarted","Data":"ce3edd9cde4b455448f5d236ce0a3b3f399bdfdf723d4a5bf12e27cef26c6142"} Feb 23 14:45:03.843336 master-0 kubenswrapper[28758]: I0223 14:45:03.843277 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" event={"ID":"e8af9b04-6b1e-488a-bd1e-686edc98b077","Type":"ContainerDied","Data":"a5f49c0b7369a8c0b9cb927bb77d0cf264882242daeffc029032268ef4d99b9e"} Feb 23 14:45:03.843406 master-0 kubenswrapper[28758]: I0223 14:45:03.843335 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5f49c0b7369a8c0b9cb927bb77d0cf264882242daeffc029032268ef4d99b9e" Feb 23 14:45:03.843442 master-0 kubenswrapper[28758]: I0223 14:45:03.843408 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530965-bqk6c" Feb 23 14:45:06.290546 master-0 kubenswrapper[28758]: I0223 14:45:06.290415 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-psmh5"] Feb 23 14:45:06.291242 master-0 kubenswrapper[28758]: E0223 14:45:06.291040 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8af9b04-6b1e-488a-bd1e-686edc98b077" containerName="collect-profiles" Feb 23 14:45:06.291242 master-0 kubenswrapper[28758]: I0223 14:45:06.291058 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8af9b04-6b1e-488a-bd1e-686edc98b077" containerName="collect-profiles" Feb 23 14:45:06.291344 master-0 kubenswrapper[28758]: I0223 14:45:06.291273 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8af9b04-6b1e-488a-bd1e-686edc98b077" containerName="collect-profiles" Feb 23 14:45:06.292447 master-0 kubenswrapper[28758]: I0223 14:45:06.292413 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-psmh5" Feb 23 14:45:06.295934 master-0 kubenswrapper[28758]: I0223 14:45:06.295888 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 23 14:45:06.296168 master-0 kubenswrapper[28758]: I0223 14:45:06.296122 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 23 14:45:06.307381 master-0 kubenswrapper[28758]: I0223 14:45:06.307308 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-psmh5"] Feb 23 14:45:06.342502 master-0 kubenswrapper[28758]: I0223 14:45:06.342152 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9gr\" (UniqueName: \"kubernetes.io/projected/cbfdf8a3-4911-4e93-ba4f-1322ac364244-kube-api-access-kb9gr\") pod \"openstack-operator-index-psmh5\" (UID: \"cbfdf8a3-4911-4e93-ba4f-1322ac364244\") " pod="openstack-operators/openstack-operator-index-psmh5" Feb 23 14:45:06.449786 master-0 kubenswrapper[28758]: I0223 14:45:06.449694 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9gr\" (UniqueName: \"kubernetes.io/projected/cbfdf8a3-4911-4e93-ba4f-1322ac364244-kube-api-access-kb9gr\") pod \"openstack-operator-index-psmh5\" (UID: \"cbfdf8a3-4911-4e93-ba4f-1322ac364244\") " pod="openstack-operators/openstack-operator-index-psmh5" Feb 23 14:45:06.478400 master-0 kubenswrapper[28758]: I0223 14:45:06.478328 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9gr\" (UniqueName: \"kubernetes.io/projected/cbfdf8a3-4911-4e93-ba4f-1322ac364244-kube-api-access-kb9gr\") pod \"openstack-operator-index-psmh5\" (UID: \"cbfdf8a3-4911-4e93-ba4f-1322ac364244\") " pod="openstack-operators/openstack-operator-index-psmh5" Feb 23 14:45:06.616967 master-0 
kubenswrapper[28758]: I0223 14:45:06.616821 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-psmh5" Feb 23 14:45:07.082432 master-0 kubenswrapper[28758]: I0223 14:45:07.082338 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-psmh5"] Feb 23 14:45:07.892937 master-0 kubenswrapper[28758]: I0223 14:45:07.892848 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psmh5" event={"ID":"cbfdf8a3-4911-4e93-ba4f-1322ac364244","Type":"ContainerStarted","Data":"7d4527b9c111a9de9f2235d2ca5ae077fd51885048655517d4e76a5882c66394"} Feb 23 14:45:08.801508 master-0 kubenswrapper[28758]: I0223 14:45:08.801386 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-688ffc4cbf-2brtn" podUID="a34af5db-bbf9-4a9d-a822-112be40e78fe" containerName="console" containerID="cri-o://416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa" gracePeriod=15 Feb 23 14:45:08.903346 master-0 kubenswrapper[28758]: I0223 14:45:08.903241 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psmh5" event={"ID":"cbfdf8a3-4911-4e93-ba4f-1322ac364244","Type":"ContainerStarted","Data":"944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee"} Feb 23 14:45:08.943211 master-0 kubenswrapper[28758]: I0223 14:45:08.943123 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-psmh5" podStartSLOduration=1.966004216 podStartE2EDuration="2.943103131s" podCreationTimestamp="2026-02-23 14:45:06 +0000 UTC" firstStartedPulling="2026-02-23 14:45:07.096804885 +0000 UTC m=+639.223120817" lastFinishedPulling="2026-02-23 14:45:08.0739038 +0000 UTC m=+640.200219732" observedRunningTime="2026-02-23 14:45:08.938274773 +0000 UTC m=+641.064590705" 
watchObservedRunningTime="2026-02-23 14:45:08.943103131 +0000 UTC m=+641.069419053" Feb 23 14:45:09.293666 master-0 kubenswrapper[28758]: I0223 14:45:09.293612 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-688ffc4cbf-2brtn_a34af5db-bbf9-4a9d-a822-112be40e78fe/console/0.log" Feb 23 14:45:09.293965 master-0 kubenswrapper[28758]: I0223 14:45:09.293690 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:45:09.414695 master-0 kubenswrapper[28758]: I0223 14:45:09.414625 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-oauth-serving-cert\") pod \"a34af5db-bbf9-4a9d-a822-112be40e78fe\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " Feb 23 14:45:09.414695 master-0 kubenswrapper[28758]: I0223 14:45:09.414684 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-trusted-ca-bundle\") pod \"a34af5db-bbf9-4a9d-a822-112be40e78fe\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " Feb 23 14:45:09.414695 master-0 kubenswrapper[28758]: I0223 14:45:09.414724 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-config\") pod \"a34af5db-bbf9-4a9d-a822-112be40e78fe\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " Feb 23 14:45:09.415030 master-0 kubenswrapper[28758]: I0223 14:45:09.414823 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-serving-cert\") pod \"a34af5db-bbf9-4a9d-a822-112be40e78fe\" (UID: 
\"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " Feb 23 14:45:09.415030 master-0 kubenswrapper[28758]: I0223 14:45:09.414931 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-service-ca\") pod \"a34af5db-bbf9-4a9d-a822-112be40e78fe\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " Feb 23 14:45:09.415030 master-0 kubenswrapper[28758]: I0223 14:45:09.414988 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-oauth-config\") pod \"a34af5db-bbf9-4a9d-a822-112be40e78fe\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " Feb 23 14:45:09.415157 master-0 kubenswrapper[28758]: I0223 14:45:09.415034 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqmt4\" (UniqueName: \"kubernetes.io/projected/a34af5db-bbf9-4a9d-a822-112be40e78fe-kube-api-access-rqmt4\") pod \"a34af5db-bbf9-4a9d-a822-112be40e78fe\" (UID: \"a34af5db-bbf9-4a9d-a822-112be40e78fe\") " Feb 23 14:45:09.415305 master-0 kubenswrapper[28758]: I0223 14:45:09.415247 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a34af5db-bbf9-4a9d-a822-112be40e78fe" (UID: "a34af5db-bbf9-4a9d-a822-112be40e78fe"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:45:09.415363 master-0 kubenswrapper[28758]: I0223 14:45:09.415324 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a34af5db-bbf9-4a9d-a822-112be40e78fe" (UID: "a34af5db-bbf9-4a9d-a822-112be40e78fe"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:45:09.415943 master-0 kubenswrapper[28758]: I0223 14:45:09.415458 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-config" (OuterVolumeSpecName: "console-config") pod "a34af5db-bbf9-4a9d-a822-112be40e78fe" (UID: "a34af5db-bbf9-4a9d-a822-112be40e78fe"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:45:09.415943 master-0 kubenswrapper[28758]: I0223 14:45:09.415633 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-service-ca" (OuterVolumeSpecName: "service-ca") pod "a34af5db-bbf9-4a9d-a822-112be40e78fe" (UID: "a34af5db-bbf9-4a9d-a822-112be40e78fe"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:45:09.416046 master-0 kubenswrapper[28758]: I0223 14:45:09.415962 28758 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:09.416046 master-0 kubenswrapper[28758]: I0223 14:45:09.415984 28758 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:09.416046 master-0 kubenswrapper[28758]: I0223 14:45:09.415996 28758 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:09.416046 master-0 kubenswrapper[28758]: I0223 14:45:09.416008 28758 reconciler_common.go:293] "Volume detached 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a34af5db-bbf9-4a9d-a822-112be40e78fe-service-ca\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:09.418754 master-0 kubenswrapper[28758]: I0223 14:45:09.418659 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a34af5db-bbf9-4a9d-a822-112be40e78fe" (UID: "a34af5db-bbf9-4a9d-a822-112be40e78fe"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:45:09.419226 master-0 kubenswrapper[28758]: I0223 14:45:09.419170 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a34af5db-bbf9-4a9d-a822-112be40e78fe" (UID: "a34af5db-bbf9-4a9d-a822-112be40e78fe"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:45:09.420716 master-0 kubenswrapper[28758]: I0223 14:45:09.420586 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34af5db-bbf9-4a9d-a822-112be40e78fe-kube-api-access-rqmt4" (OuterVolumeSpecName: "kube-api-access-rqmt4") pod "a34af5db-bbf9-4a9d-a822-112be40e78fe" (UID: "a34af5db-bbf9-4a9d-a822-112be40e78fe"). InnerVolumeSpecName "kube-api-access-rqmt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:45:09.518141 master-0 kubenswrapper[28758]: I0223 14:45:09.518023 28758 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:09.518141 master-0 kubenswrapper[28758]: I0223 14:45:09.518070 28758 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a34af5db-bbf9-4a9d-a822-112be40e78fe-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:09.518141 master-0 kubenswrapper[28758]: I0223 14:45:09.518079 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqmt4\" (UniqueName: \"kubernetes.io/projected/a34af5db-bbf9-4a9d-a822-112be40e78fe-kube-api-access-rqmt4\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:09.632997 master-0 kubenswrapper[28758]: I0223 14:45:09.632921 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:45:09.636115 master-0 kubenswrapper[28758]: I0223 14:45:09.636047 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:45:09.913886 master-0 kubenswrapper[28758]: I0223 14:45:09.913769 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-688ffc4cbf-2brtn_a34af5db-bbf9-4a9d-a822-112be40e78fe/console/0.log" Feb 23 14:45:09.913886 master-0 kubenswrapper[28758]: I0223 14:45:09.913823 28758 generic.go:334] "Generic (PLEG): container finished" podID="a34af5db-bbf9-4a9d-a822-112be40e78fe" containerID="416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa" exitCode=2 Feb 23 14:45:09.914719 master-0 kubenswrapper[28758]: I0223 14:45:09.913895 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-688ffc4cbf-2brtn" Feb 23 14:45:09.914719 master-0 kubenswrapper[28758]: I0223 14:45:09.913905 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688ffc4cbf-2brtn" event={"ID":"a34af5db-bbf9-4a9d-a822-112be40e78fe","Type":"ContainerDied","Data":"416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa"} Feb 23 14:45:09.914719 master-0 kubenswrapper[28758]: I0223 14:45:09.913956 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-688ffc4cbf-2brtn" event={"ID":"a34af5db-bbf9-4a9d-a822-112be40e78fe","Type":"ContainerDied","Data":"498b3ed84d24314845b96669bd21a641bff94f329e776e9433c847fcf9d3c13d"} Feb 23 14:45:09.914719 master-0 kubenswrapper[28758]: I0223 14:45:09.913981 28758 scope.go:117] "RemoveContainer" containerID="416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa" Feb 23 14:45:09.914902 master-0 kubenswrapper[28758]: I0223 14:45:09.914858 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:45:09.916007 master-0 kubenswrapper[28758]: I0223 14:45:09.915953 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-5jvzp" Feb 23 14:45:09.934221 master-0 kubenswrapper[28758]: I0223 14:45:09.934185 28758 scope.go:117] "RemoveContainer" containerID="416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa" Feb 23 14:45:09.934603 master-0 kubenswrapper[28758]: E0223 14:45:09.934559 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa\": container with ID starting with 416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa not found: ID does not exist" containerID="416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa" Feb 23 
14:45:09.934603 master-0 kubenswrapper[28758]: I0223 14:45:09.934589 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa"} err="failed to get container status \"416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa\": rpc error: code = NotFound desc = could not find container \"416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa\": container with ID starting with 416ba7badcc5c7f0a84772cee6ed1cecc21c532a8eb50c3097be01e6d740ccfa not found: ID does not exist" Feb 23 14:45:09.977564 master-0 kubenswrapper[28758]: I0223 14:45:09.977454 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-688ffc4cbf-2brtn"] Feb 23 14:45:09.989191 master-0 kubenswrapper[28758]: I0223 14:45:09.989109 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-688ffc4cbf-2brtn"] Feb 23 14:45:10.102586 master-0 kubenswrapper[28758]: I0223 14:45:10.100911 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34af5db-bbf9-4a9d-a822-112be40e78fe" path="/var/lib/kubelet/pods/a34af5db-bbf9-4a9d-a822-112be40e78fe/volumes" Feb 23 14:45:10.434607 master-0 kubenswrapper[28758]: I0223 14:45:10.434530 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-psmh5"] Feb 23 14:45:10.924301 master-0 kubenswrapper[28758]: I0223 14:45:10.924187 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-psmh5" podUID="cbfdf8a3-4911-4e93-ba4f-1322ac364244" containerName="registry-server" containerID="cri-o://944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee" gracePeriod=2 Feb 23 14:45:11.061877 master-0 kubenswrapper[28758]: I0223 14:45:11.061609 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-swrgd"] Feb 23 
14:45:11.062384 master-0 kubenswrapper[28758]: E0223 14:45:11.062348 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34af5db-bbf9-4a9d-a822-112be40e78fe" containerName="console" Feb 23 14:45:11.062384 master-0 kubenswrapper[28758]: I0223 14:45:11.062373 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34af5db-bbf9-4a9d-a822-112be40e78fe" containerName="console" Feb 23 14:45:11.062661 master-0 kubenswrapper[28758]: I0223 14:45:11.062594 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34af5db-bbf9-4a9d-a822-112be40e78fe" containerName="console" Feb 23 14:45:11.063184 master-0 kubenswrapper[28758]: I0223 14:45:11.063155 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:11.073066 master-0 kubenswrapper[28758]: I0223 14:45:11.071276 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swrgd"] Feb 23 14:45:11.152500 master-0 kubenswrapper[28758]: I0223 14:45:11.152402 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pkm\" (UniqueName: \"kubernetes.io/projected/f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7-kube-api-access-j4pkm\") pod \"openstack-operator-index-swrgd\" (UID: \"f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7\") " pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:11.253608 master-0 kubenswrapper[28758]: I0223 14:45:11.253559 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pkm\" (UniqueName: \"kubernetes.io/projected/f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7-kube-api-access-j4pkm\") pod \"openstack-operator-index-swrgd\" (UID: \"f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7\") " pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:11.268682 master-0 kubenswrapper[28758]: I0223 14:45:11.268651 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pkm\" (UniqueName: \"kubernetes.io/projected/f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7-kube-api-access-j4pkm\") pod \"openstack-operator-index-swrgd\" (UID: \"f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7\") " pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:11.433901 master-0 kubenswrapper[28758]: I0223 14:45:11.433863 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:11.508927 master-0 kubenswrapper[28758]: I0223 14:45:11.508627 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-psmh5" Feb 23 14:45:11.578404 master-0 kubenswrapper[28758]: I0223 14:45:11.578341 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb9gr\" (UniqueName: \"kubernetes.io/projected/cbfdf8a3-4911-4e93-ba4f-1322ac364244-kube-api-access-kb9gr\") pod \"cbfdf8a3-4911-4e93-ba4f-1322ac364244\" (UID: \"cbfdf8a3-4911-4e93-ba4f-1322ac364244\") " Feb 23 14:45:11.581232 master-0 kubenswrapper[28758]: I0223 14:45:11.581185 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbfdf8a3-4911-4e93-ba4f-1322ac364244-kube-api-access-kb9gr" (OuterVolumeSpecName: "kube-api-access-kb9gr") pod "cbfdf8a3-4911-4e93-ba4f-1322ac364244" (UID: "cbfdf8a3-4911-4e93-ba4f-1322ac364244"). InnerVolumeSpecName "kube-api-access-kb9gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:45:11.682449 master-0 kubenswrapper[28758]: I0223 14:45:11.682398 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb9gr\" (UniqueName: \"kubernetes.io/projected/cbfdf8a3-4911-4e93-ba4f-1322ac364244-kube-api-access-kb9gr\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:11.913187 master-0 kubenswrapper[28758]: I0223 14:45:11.912645 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swrgd"] Feb 23 14:45:11.932532 master-0 kubenswrapper[28758]: I0223 14:45:11.932459 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swrgd" event={"ID":"f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7","Type":"ContainerStarted","Data":"32a2508fc6d1b4e3b1efa9d756e77595d29e504aa959dfe614798c2f4958a0fb"} Feb 23 14:45:11.934301 master-0 kubenswrapper[28758]: I0223 14:45:11.934253 28758 generic.go:334] "Generic (PLEG): container finished" podID="cbfdf8a3-4911-4e93-ba4f-1322ac364244" containerID="944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee" exitCode=0 Feb 23 14:45:11.934301 master-0 kubenswrapper[28758]: I0223 14:45:11.934289 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-psmh5" Feb 23 14:45:11.934400 master-0 kubenswrapper[28758]: I0223 14:45:11.934328 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psmh5" event={"ID":"cbfdf8a3-4911-4e93-ba4f-1322ac364244","Type":"ContainerDied","Data":"944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee"} Feb 23 14:45:11.934400 master-0 kubenswrapper[28758]: I0223 14:45:11.934370 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-psmh5" event={"ID":"cbfdf8a3-4911-4e93-ba4f-1322ac364244","Type":"ContainerDied","Data":"7d4527b9c111a9de9f2235d2ca5ae077fd51885048655517d4e76a5882c66394"} Feb 23 14:45:11.934400 master-0 kubenswrapper[28758]: I0223 14:45:11.934391 28758 scope.go:117] "RemoveContainer" containerID="944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee" Feb 23 14:45:11.950053 master-0 kubenswrapper[28758]: I0223 14:45:11.949588 28758 scope.go:117] "RemoveContainer" containerID="944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee" Feb 23 14:45:11.950053 master-0 kubenswrapper[28758]: E0223 14:45:11.949985 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee\": container with ID starting with 944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee not found: ID does not exist" containerID="944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee" Feb 23 14:45:11.950053 master-0 kubenswrapper[28758]: I0223 14:45:11.950022 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee"} err="failed to get container status \"944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee\": rpc error: code = 
NotFound desc = could not find container \"944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee\": container with ID starting with 944bd45b0b555fd09bb12d1142c61be3bd5c9be962cca2d01c0ff2a02547ecee not found: ID does not exist" Feb 23 14:45:12.067043 master-0 kubenswrapper[28758]: I0223 14:45:12.066940 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-psmh5"] Feb 23 14:45:12.074862 master-0 kubenswrapper[28758]: I0223 14:45:12.074800 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-psmh5"] Feb 23 14:45:12.098057 master-0 kubenswrapper[28758]: I0223 14:45:12.097973 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbfdf8a3-4911-4e93-ba4f-1322ac364244" path="/var/lib/kubelet/pods/cbfdf8a3-4911-4e93-ba4f-1322ac364244/volumes" Feb 23 14:45:12.946284 master-0 kubenswrapper[28758]: I0223 14:45:12.946211 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swrgd" event={"ID":"f652f5b1-6aeb-4b2e-9c69-f053fa17c6b7","Type":"ContainerStarted","Data":"bafa76113134c8bca150725e0e22f0c921e159584bec181ee4330d903c03ce45"} Feb 23 14:45:12.980888 master-0 kubenswrapper[28758]: I0223 14:45:12.979456 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-swrgd" podStartSLOduration=1.193583968 podStartE2EDuration="1.979414707s" podCreationTimestamp="2026-02-23 14:45:11 +0000 UTC" firstStartedPulling="2026-02-23 14:45:11.917269954 +0000 UTC m=+644.043585896" lastFinishedPulling="2026-02-23 14:45:12.703100703 +0000 UTC m=+644.829416635" observedRunningTime="2026-02-23 14:45:12.970056929 +0000 UTC m=+645.096372861" watchObservedRunningTime="2026-02-23 14:45:12.979414707 +0000 UTC m=+645.105730639" Feb 23 14:45:21.434693 master-0 kubenswrapper[28758]: I0223 14:45:21.434628 28758 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:21.435634 master-0 kubenswrapper[28758]: I0223 14:45:21.434765 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:21.470398 master-0 kubenswrapper[28758]: I0223 14:45:21.470340 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:22.056188 master-0 kubenswrapper[28758]: I0223 14:45:22.056138 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-swrgd" Feb 23 14:45:23.703672 master-0 kubenswrapper[28758]: I0223 14:45:23.703598 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5"] Feb 23 14:45:23.704239 master-0 kubenswrapper[28758]: E0223 14:45:23.704019 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfdf8a3-4911-4e93-ba4f-1322ac364244" containerName="registry-server" Feb 23 14:45:23.704239 master-0 kubenswrapper[28758]: I0223 14:45:23.704035 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfdf8a3-4911-4e93-ba4f-1322ac364244" containerName="registry-server" Feb 23 14:45:23.704334 master-0 kubenswrapper[28758]: I0223 14:45:23.704265 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfdf8a3-4911-4e93-ba4f-1322ac364244" containerName="registry-server" Feb 23 14:45:23.705891 master-0 kubenswrapper[28758]: I0223 14:45:23.705856 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.717803 master-0 kubenswrapper[28758]: I0223 14:45:23.717745 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5"] Feb 23 14:45:23.792064 master-0 kubenswrapper[28758]: I0223 14:45:23.792000 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-util\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.792362 master-0 kubenswrapper[28758]: I0223 14:45:23.792342 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7sfq\" (UniqueName: \"kubernetes.io/projected/56066a12-1d5b-4e64-9227-290d644a6906-kube-api-access-n7sfq\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.792550 master-0 kubenswrapper[28758]: I0223 14:45:23.792511 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-bundle\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.894670 master-0 kubenswrapper[28758]: I0223 14:45:23.893946 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-util\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.894670 master-0 kubenswrapper[28758]: I0223 14:45:23.894067 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7sfq\" (UniqueName: \"kubernetes.io/projected/56066a12-1d5b-4e64-9227-290d644a6906-kube-api-access-n7sfq\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.894670 master-0 kubenswrapper[28758]: I0223 14:45:23.894142 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-bundle\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.894670 master-0 kubenswrapper[28758]: I0223 14:45:23.894620 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-util\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.894670 master-0 kubenswrapper[28758]: I0223 14:45:23.894649 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-bundle\") pod 
\"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:23.914710 master-0 kubenswrapper[28758]: I0223 14:45:23.914583 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7sfq\" (UniqueName: \"kubernetes.io/projected/56066a12-1d5b-4e64-9227-290d644a6906-kube-api-access-n7sfq\") pod \"11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:24.025789 master-0 kubenswrapper[28758]: I0223 14:45:24.025605 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:24.499062 master-0 kubenswrapper[28758]: I0223 14:45:24.498977 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5"] Feb 23 14:45:24.517735 master-0 kubenswrapper[28758]: W0223 14:45:24.517668 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56066a12_1d5b_4e64_9227_290d644a6906.slice/crio-80d81f2f0972c12512a393f8fce95e96bea0e2d025614944066cc3c61fe71b89 WatchSource:0}: Error finding container 80d81f2f0972c12512a393f8fce95e96bea0e2d025614944066cc3c61fe71b89: Status 404 returned error can't find the container with id 80d81f2f0972c12512a393f8fce95e96bea0e2d025614944066cc3c61fe71b89 Feb 23 14:45:25.047151 master-0 kubenswrapper[28758]: I0223 14:45:25.047087 28758 generic.go:334] "Generic (PLEG): container finished" podID="56066a12-1d5b-4e64-9227-290d644a6906" containerID="af79444697a6cb7806d979a0caa8b3b67c80699e2c1aed01f80f0f92ed98367b" exitCode=0 Feb 23 
14:45:25.047151 master-0 kubenswrapper[28758]: I0223 14:45:25.047139 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" event={"ID":"56066a12-1d5b-4e64-9227-290d644a6906","Type":"ContainerDied","Data":"af79444697a6cb7806d979a0caa8b3b67c80699e2c1aed01f80f0f92ed98367b"} Feb 23 14:45:25.047151 master-0 kubenswrapper[28758]: I0223 14:45:25.047164 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" event={"ID":"56066a12-1d5b-4e64-9227-290d644a6906","Type":"ContainerStarted","Data":"80d81f2f0972c12512a393f8fce95e96bea0e2d025614944066cc3c61fe71b89"} Feb 23 14:45:26.056637 master-0 kubenswrapper[28758]: I0223 14:45:26.056542 28758 generic.go:334] "Generic (PLEG): container finished" podID="56066a12-1d5b-4e64-9227-290d644a6906" containerID="b5002b7314774811390b5a2147866949760555ca3d49fb11e055f80a6311251c" exitCode=0 Feb 23 14:45:26.056637 master-0 kubenswrapper[28758]: I0223 14:45:26.056591 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" event={"ID":"56066a12-1d5b-4e64-9227-290d644a6906","Type":"ContainerDied","Data":"b5002b7314774811390b5a2147866949760555ca3d49fb11e055f80a6311251c"} Feb 23 14:45:27.065667 master-0 kubenswrapper[28758]: I0223 14:45:27.065575 28758 generic.go:334] "Generic (PLEG): container finished" podID="56066a12-1d5b-4e64-9227-290d644a6906" containerID="6116ee834c214b9d6e052f9bef9f770c42a1307794526fe3424e08d9cee98134" exitCode=0 Feb 23 14:45:27.065667 master-0 kubenswrapper[28758]: I0223 14:45:27.065637 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" 
event={"ID":"56066a12-1d5b-4e64-9227-290d644a6906","Type":"ContainerDied","Data":"6116ee834c214b9d6e052f9bef9f770c42a1307794526fe3424e08d9cee98134"} Feb 23 14:45:28.402345 master-0 kubenswrapper[28758]: I0223 14:45:28.402267 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:28.483292 master-0 kubenswrapper[28758]: I0223 14:45:28.483215 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7sfq\" (UniqueName: \"kubernetes.io/projected/56066a12-1d5b-4e64-9227-290d644a6906-kube-api-access-n7sfq\") pod \"56066a12-1d5b-4e64-9227-290d644a6906\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " Feb 23 14:45:28.483568 master-0 kubenswrapper[28758]: I0223 14:45:28.483536 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-util\") pod \"56066a12-1d5b-4e64-9227-290d644a6906\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " Feb 23 14:45:28.483646 master-0 kubenswrapper[28758]: I0223 14:45:28.483587 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-bundle\") pod \"56066a12-1d5b-4e64-9227-290d644a6906\" (UID: \"56066a12-1d5b-4e64-9227-290d644a6906\") " Feb 23 14:45:28.484464 master-0 kubenswrapper[28758]: I0223 14:45:28.484418 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-bundle" (OuterVolumeSpecName: "bundle") pod "56066a12-1d5b-4e64-9227-290d644a6906" (UID: "56066a12-1d5b-4e64-9227-290d644a6906"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:45:28.486064 master-0 kubenswrapper[28758]: I0223 14:45:28.486033 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56066a12-1d5b-4e64-9227-290d644a6906-kube-api-access-n7sfq" (OuterVolumeSpecName: "kube-api-access-n7sfq") pod "56066a12-1d5b-4e64-9227-290d644a6906" (UID: "56066a12-1d5b-4e64-9227-290d644a6906"). InnerVolumeSpecName "kube-api-access-n7sfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:45:28.496963 master-0 kubenswrapper[28758]: I0223 14:45:28.496942 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-util" (OuterVolumeSpecName: "util") pod "56066a12-1d5b-4e64-9227-290d644a6906" (UID: "56066a12-1d5b-4e64-9227-290d644a6906"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:45:28.586778 master-0 kubenswrapper[28758]: I0223 14:45:28.586690 28758 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-util\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:28.586778 master-0 kubenswrapper[28758]: I0223 14:45:28.586769 28758 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56066a12-1d5b-4e64-9227-290d644a6906-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:28.587088 master-0 kubenswrapper[28758]: I0223 14:45:28.586803 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7sfq\" (UniqueName: \"kubernetes.io/projected/56066a12-1d5b-4e64-9227-290d644a6906-kube-api-access-n7sfq\") on node \"master-0\" DevicePath \"\"" Feb 23 14:45:29.085478 master-0 kubenswrapper[28758]: I0223 14:45:29.085384 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" event={"ID":"56066a12-1d5b-4e64-9227-290d644a6906","Type":"ContainerDied","Data":"80d81f2f0972c12512a393f8fce95e96bea0e2d025614944066cc3c61fe71b89"} Feb 23 14:45:29.085478 master-0 kubenswrapper[28758]: I0223 14:45:29.085469 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80d81f2f0972c12512a393f8fce95e96bea0e2d025614944066cc3c61fe71b89" Feb 23 14:45:29.085478 master-0 kubenswrapper[28758]: I0223 14:45:29.085401 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5" Feb 23 14:45:36.425568 master-0 kubenswrapper[28758]: I0223 14:45:36.425495 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2"] Feb 23 14:45:36.426408 master-0 kubenswrapper[28758]: E0223 14:45:36.425930 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56066a12-1d5b-4e64-9227-290d644a6906" containerName="pull" Feb 23 14:45:36.426408 master-0 kubenswrapper[28758]: I0223 14:45:36.425948 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="56066a12-1d5b-4e64-9227-290d644a6906" containerName="pull" Feb 23 14:45:36.426408 master-0 kubenswrapper[28758]: E0223 14:45:36.425978 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56066a12-1d5b-4e64-9227-290d644a6906" containerName="extract" Feb 23 14:45:36.426408 master-0 kubenswrapper[28758]: I0223 14:45:36.425986 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="56066a12-1d5b-4e64-9227-290d644a6906" containerName="extract" Feb 23 14:45:36.426408 master-0 kubenswrapper[28758]: E0223 14:45:36.425996 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56066a12-1d5b-4e64-9227-290d644a6906" containerName="util" Feb 23 14:45:36.426408 master-0 
kubenswrapper[28758]: I0223 14:45:36.426004 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="56066a12-1d5b-4e64-9227-290d644a6906" containerName="util" Feb 23 14:45:36.426408 master-0 kubenswrapper[28758]: I0223 14:45:36.426236 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="56066a12-1d5b-4e64-9227-290d644a6906" containerName="extract" Feb 23 14:45:36.426991 master-0 kubenswrapper[28758]: I0223 14:45:36.426962 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" Feb 23 14:45:36.455438 master-0 kubenswrapper[28758]: I0223 14:45:36.455367 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2"] Feb 23 14:45:36.509916 master-0 kubenswrapper[28758]: I0223 14:45:36.509857 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd5qx\" (UniqueName: \"kubernetes.io/projected/2a582179-6ad7-46d4-8bac-01c14ddaaa09-kube-api-access-dd5qx\") pod \"openstack-operator-controller-init-55c649df44-jq7t2\" (UID: \"2a582179-6ad7-46d4-8bac-01c14ddaaa09\") " pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" Feb 23 14:45:36.614501 master-0 kubenswrapper[28758]: I0223 14:45:36.614404 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5qx\" (UniqueName: \"kubernetes.io/projected/2a582179-6ad7-46d4-8bac-01c14ddaaa09-kube-api-access-dd5qx\") pod \"openstack-operator-controller-init-55c649df44-jq7t2\" (UID: \"2a582179-6ad7-46d4-8bac-01c14ddaaa09\") " pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" Feb 23 14:45:36.654752 master-0 kubenswrapper[28758]: I0223 14:45:36.654680 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5qx\" (UniqueName: 
\"kubernetes.io/projected/2a582179-6ad7-46d4-8bac-01c14ddaaa09-kube-api-access-dd5qx\") pod \"openstack-operator-controller-init-55c649df44-jq7t2\" (UID: \"2a582179-6ad7-46d4-8bac-01c14ddaaa09\") " pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" Feb 23 14:45:36.745010 master-0 kubenswrapper[28758]: I0223 14:45:36.744950 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" Feb 23 14:45:37.199850 master-0 kubenswrapper[28758]: I0223 14:45:37.199791 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2"] Feb 23 14:45:37.216636 master-0 kubenswrapper[28758]: I0223 14:45:37.216361 28758 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 14:45:38.159540 master-0 kubenswrapper[28758]: I0223 14:45:38.159414 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" event={"ID":"2a582179-6ad7-46d4-8bac-01c14ddaaa09","Type":"ContainerStarted","Data":"25ba2c0470c838f1b74daebc44b9b3bdcc9428192c1b78a64fe4724c6b450023"} Feb 23 14:45:42.197235 master-0 kubenswrapper[28758]: I0223 14:45:42.197124 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" event={"ID":"2a582179-6ad7-46d4-8bac-01c14ddaaa09","Type":"ContainerStarted","Data":"d82835fbc35529128634216584c436522437d2f04dcd6dfca24f1e7b1e147e34"} Feb 23 14:45:42.197943 master-0 kubenswrapper[28758]: I0223 14:45:42.197253 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" Feb 23 14:45:42.379849 master-0 kubenswrapper[28758]: I0223 14:45:42.379738 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" podStartSLOduration=1.9581428189999999 podStartE2EDuration="6.379718002s" podCreationTimestamp="2026-02-23 14:45:36 +0000 UTC" firstStartedPulling="2026-02-23 14:45:37.216303958 +0000 UTC m=+669.342619890" lastFinishedPulling="2026-02-23 14:45:41.637879141 +0000 UTC m=+673.764195073" observedRunningTime="2026-02-23 14:45:42.377339119 +0000 UTC m=+674.503655071" watchObservedRunningTime="2026-02-23 14:45:42.379718002 +0000 UTC m=+674.506033934" Feb 23 14:45:46.750524 master-0 kubenswrapper[28758]: I0223 14:45:46.750443 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-55c649df44-jq7t2" Feb 23 14:46:08.010557 master-0 kubenswrapper[28758]: I0223 14:46:08.009893 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk"] Feb 23 14:46:08.012146 master-0 kubenswrapper[28758]: I0223 14:46:08.011337 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" Feb 23 14:46:08.031495 master-0 kubenswrapper[28758]: I0223 14:46:08.028694 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h"] Feb 23 14:46:08.031495 master-0 kubenswrapper[28758]: I0223 14:46:08.029870 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" Feb 23 14:46:08.049049 master-0 kubenswrapper[28758]: I0223 14:46:08.048555 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk"] Feb 23 14:46:08.058118 master-0 kubenswrapper[28758]: I0223 14:46:08.054516 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h"] Feb 23 14:46:08.061830 master-0 kubenswrapper[28758]: I0223 14:46:08.061549 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh"] Feb 23 14:46:08.063387 master-0 kubenswrapper[28758]: I0223 14:46:08.062747 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" Feb 23 14:46:08.111203 master-0 kubenswrapper[28758]: I0223 14:46:08.109657 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xbdk\" (UniqueName: \"kubernetes.io/projected/c9893b1e-ab36-4e25-93f9-fe364916347a-kube-api-access-9xbdk\") pod \"barbican-operator-controller-manager-868647ff47-9trkk\" (UID: \"c9893b1e-ab36-4e25-93f9-fe364916347a\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" Feb 23 14:46:08.115542 master-0 kubenswrapper[28758]: I0223 14:46:08.114626 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/b99e7588-8c4f-48ff-8de9-12eef47ee79b-kube-api-access-2mkh6\") pod \"cinder-operator-controller-manager-55d77d7b5c-zgz2h\" (UID: \"b99e7588-8c4f-48ff-8de9-12eef47ee79b\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" Feb 23 14:46:08.132985 master-0 
kubenswrapper[28758]: I0223 14:46:08.132914 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh"] Feb 23 14:46:08.132985 master-0 kubenswrapper[28758]: I0223 14:46:08.132980 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl"] Feb 23 14:46:08.148927 master-0 kubenswrapper[28758]: I0223 14:46:08.145422 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl"] Feb 23 14:46:08.148927 master-0 kubenswrapper[28758]: I0223 14:46:08.145579 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl" Feb 23 14:46:08.170069 master-0 kubenswrapper[28758]: I0223 14:46:08.163600 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-psqph"] Feb 23 14:46:08.170524 master-0 kubenswrapper[28758]: I0223 14:46:08.170461 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" Feb 23 14:46:08.172619 master-0 kubenswrapper[28758]: I0223 14:46:08.171266 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-psqph"] Feb 23 14:46:08.201713 master-0 kubenswrapper[28758]: I0223 14:46:08.200110 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs"] Feb 23 14:46:08.201713 master-0 kubenswrapper[28758]: I0223 14:46:08.201401 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs" Feb 23 14:46:08.224011 master-0 kubenswrapper[28758]: I0223 14:46:08.223921 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs"] Feb 23 14:46:08.224931 master-0 kubenswrapper[28758]: I0223 14:46:08.224890 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/b99e7588-8c4f-48ff-8de9-12eef47ee79b-kube-api-access-2mkh6\") pod \"cinder-operator-controller-manager-55d77d7b5c-zgz2h\" (UID: \"b99e7588-8c4f-48ff-8de9-12eef47ee79b\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" Feb 23 14:46:08.225184 master-0 kubenswrapper[28758]: I0223 14:46:08.225134 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdcjb\" (UniqueName: \"kubernetes.io/projected/5602bab5-e632-45c0-9a58-fca8c507ff8d-kube-api-access-pdcjb\") pod \"designate-operator-controller-manager-6d8bf5c495-vd6sh\" (UID: \"5602bab5-e632-45c0-9a58-fca8c507ff8d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" Feb 23 14:46:08.226294 master-0 kubenswrapper[28758]: I0223 14:46:08.226262 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xbdk\" (UniqueName: \"kubernetes.io/projected/c9893b1e-ab36-4e25-93f9-fe364916347a-kube-api-access-9xbdk\") pod \"barbican-operator-controller-manager-868647ff47-9trkk\" (UID: \"c9893b1e-ab36-4e25-93f9-fe364916347a\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" Feb 23 14:46:08.226367 master-0 kubenswrapper[28758]: I0223 14:46:08.226335 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8gj\" (UniqueName: 
\"kubernetes.io/projected/14ce0057-4678-4bf7-bf28-256945f8a589-kube-api-access-bt8gj\") pod \"glance-operator-controller-manager-784b5bb6c5-96nsl\" (UID: \"14ce0057-4678-4bf7-bf28-256945f8a589\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl" Feb 23 14:46:08.237951 master-0 kubenswrapper[28758]: I0223 14:46:08.237630 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"] Feb 23 14:46:08.238850 master-0 kubenswrapper[28758]: I0223 14:46:08.238814 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:08.243825 master-0 kubenswrapper[28758]: I0223 14:46:08.243781 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 23 14:46:08.246968 master-0 kubenswrapper[28758]: I0223 14:46:08.246908 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xbdk\" (UniqueName: \"kubernetes.io/projected/c9893b1e-ab36-4e25-93f9-fe364916347a-kube-api-access-9xbdk\") pod \"barbican-operator-controller-manager-868647ff47-9trkk\" (UID: \"c9893b1e-ab36-4e25-93f9-fe364916347a\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" Feb 23 14:46:08.256568 master-0 kubenswrapper[28758]: I0223 14:46:08.252089 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mkh6\" (UniqueName: \"kubernetes.io/projected/b99e7588-8c4f-48ff-8de9-12eef47ee79b-kube-api-access-2mkh6\") pod \"cinder-operator-controller-manager-55d77d7b5c-zgz2h\" (UID: \"b99e7588-8c4f-48ff-8de9-12eef47ee79b\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" Feb 23 14:46:08.279586 master-0 kubenswrapper[28758]: I0223 14:46:08.273683 28758 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb"] Feb 23 14:46:08.279586 master-0 kubenswrapper[28758]: I0223 14:46:08.276350 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" Feb 23 14:46:08.284846 master-0 kubenswrapper[28758]: I0223 14:46:08.284800 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"] Feb 23 14:46:08.311954 master-0 kubenswrapper[28758]: I0223 14:46:08.311889 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb"] Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.326153 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"] Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.327583 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"
Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.328608 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"
Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.328641 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcc5n\" (UniqueName: \"kubernetes.io/projected/13419178-05f6-4d41-be2b-2849b477ff68-kube-api-access-xcc5n\") pod \"heat-operator-controller-manager-69f49c598c-psqph\" (UID: \"13419178-05f6-4d41-be2b-2849b477ff68\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph"
Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.328665 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcp8\" (UniqueName: \"kubernetes.io/projected/64270f21-553d-4155-b8e0-b45d5547285b-kube-api-access-rxcp8\") pod \"horizon-operator-controller-manager-5b9b8895d5-5kwfs\" (UID: \"64270f21-553d-4155-b8e0-b45d5547285b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs"
Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.328759 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdcjb\" (UniqueName: \"kubernetes.io/projected/5602bab5-e632-45c0-9a58-fca8c507ff8d-kube-api-access-pdcjb\") pod \"designate-operator-controller-manager-6d8bf5c495-vd6sh\" (UID: \"5602bab5-e632-45c0-9a58-fca8c507ff8d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh"
Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.329105 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnwr6\" (UniqueName: \"kubernetes.io/projected/89e18c7d-ffb4-44e5-b640-e26175c114e1-kube-api-access-lnwr6\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"
Feb 23 14:46:08.331395 master-0 kubenswrapper[28758]: I0223 14:46:08.329194 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8gj\" (UniqueName: \"kubernetes.io/projected/14ce0057-4678-4bf7-bf28-256945f8a589-kube-api-access-bt8gj\") pod \"glance-operator-controller-manager-784b5bb6c5-96nsl\" (UID: \"14ce0057-4678-4bf7-bf28-256945f8a589\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl"
Feb 23 14:46:08.342766 master-0 kubenswrapper[28758]: I0223 14:46:08.342706 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk"
Feb 23 14:46:08.360635 master-0 kubenswrapper[28758]: I0223 14:46:08.360583 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"]
Feb 23 14:46:08.367748 master-0 kubenswrapper[28758]: I0223 14:46:08.367709 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"]
Feb 23 14:46:08.368986 master-0 kubenswrapper[28758]: I0223 14:46:08.368955 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"
Feb 23 14:46:08.384215 master-0 kubenswrapper[28758]: I0223 14:46:08.384170 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"]
Feb 23 14:46:08.384787 master-0 kubenswrapper[28758]: I0223 14:46:08.384749 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h"
Feb 23 14:46:08.385366 master-0 kubenswrapper[28758]: I0223 14:46:08.385337 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"
Feb 23 14:46:08.394022 master-0 kubenswrapper[28758]: I0223 14:46:08.393954 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"]
Feb 23 14:46:08.401058 master-0 kubenswrapper[28758]: I0223 14:46:08.401009 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"]
Feb 23 14:46:08.408179 master-0 kubenswrapper[28758]: I0223 14:46:08.408123 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"]
Feb 23 14:46:08.409361 master-0 kubenswrapper[28758]: I0223 14:46:08.409333 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"
Feb 23 14:46:08.426493 master-0 kubenswrapper[28758]: I0223 14:46:08.426408 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"]
Feb 23 14:46:08.430785 master-0 kubenswrapper[28758]: I0223 14:46:08.430734 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"
Feb 23 14:46:08.430899 master-0 kubenswrapper[28758]: I0223 14:46:08.430814 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcc5n\" (UniqueName: \"kubernetes.io/projected/13419178-05f6-4d41-be2b-2849b477ff68-kube-api-access-xcc5n\") pod \"heat-operator-controller-manager-69f49c598c-psqph\" (UID: \"13419178-05f6-4d41-be2b-2849b477ff68\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph"
Feb 23 14:46:08.430899 master-0 kubenswrapper[28758]: I0223 14:46:08.430866 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcp8\" (UniqueName: \"kubernetes.io/projected/64270f21-553d-4155-b8e0-b45d5547285b-kube-api-access-rxcp8\") pod \"horizon-operator-controller-manager-5b9b8895d5-5kwfs\" (UID: \"64270f21-553d-4155-b8e0-b45d5547285b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs"
Feb 23 14:46:08.431000 master-0 kubenswrapper[28758]: E0223 14:46:08.430885 28758 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 23 14:46:08.431000 master-0 kubenswrapper[28758]: E0223 14:46:08.430968 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert podName:89e18c7d-ffb4-44e5-b640-e26175c114e1 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:08.93094649 +0000 UTC m=+701.057262422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert") pod "infra-operator-controller-manager-5f879c76b6-5zmbh" (UID: "89e18c7d-ffb4-44e5-b640-e26175c114e1") : secret "infra-operator-webhook-server-cert" not found
Feb 23 14:46:08.431000 master-0 kubenswrapper[28758]: I0223 14:46:08.430900 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgmm7\" (UniqueName: \"kubernetes.io/projected/ada28ec1-a66b-4668-941c-c0f0cd424ee4-kube-api-access-fgmm7\") pod \"ironic-operator-controller-manager-554564d7fc-dkqkb\" (UID: \"ada28ec1-a66b-4668-941c-c0f0cd424ee4\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb"
Feb 23 14:46:08.431157 master-0 kubenswrapper[28758]: I0223 14:46:08.431127 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnwr6\" (UniqueName: \"kubernetes.io/projected/89e18c7d-ffb4-44e5-b640-e26175c114e1-kube-api-access-lnwr6\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"
Feb 23 14:46:08.431213 master-0 kubenswrapper[28758]: I0223 14:46:08.431200 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwsxn\" (UniqueName: \"kubernetes.io/projected/b00988a0-8239-4f73-832d-5e28c3afac6a-kube-api-access-wwsxn\") pod \"keystone-operator-controller-manager-b4d948c87-zftk6\" (UID: \"b00988a0-8239-4f73-832d-5e28c3afac6a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"
Feb 23 14:46:08.437977 master-0 kubenswrapper[28758]: I0223 14:46:08.437883 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"]
Feb 23 14:46:08.439523 master-0 kubenswrapper[28758]: I0223 14:46:08.439501 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"
Feb 23 14:46:08.518965 master-0 kubenswrapper[28758]: I0223 14:46:08.518894 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8gj\" (UniqueName: \"kubernetes.io/projected/14ce0057-4678-4bf7-bf28-256945f8a589-kube-api-access-bt8gj\") pod \"glance-operator-controller-manager-784b5bb6c5-96nsl\" (UID: \"14ce0057-4678-4bf7-bf28-256945f8a589\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl"
Feb 23 14:46:08.521586 master-0 kubenswrapper[28758]: I0223 14:46:08.521553 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"]
Feb 23 14:46:08.522760 master-0 kubenswrapper[28758]: I0223 14:46:08.522733 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"
Feb 23 14:46:08.532078 master-0 kubenswrapper[28758]: I0223 14:46:08.532020 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"]
Feb 23 14:46:08.535245 master-0 kubenswrapper[28758]: I0223 14:46:08.533504 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r6lc\" (UniqueName: \"kubernetes.io/projected/eb045dcf-8041-4baf-a750-6ad9e951dd65-kube-api-access-4r6lc\") pod \"nova-operator-controller-manager-567668f5cf-bcvt8\" (UID: \"eb045dcf-8041-4baf-a750-6ad9e951dd65\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"
Feb 23 14:46:08.535245 master-0 kubenswrapper[28758]: I0223 14:46:08.533538 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6q2b\" (UniqueName: \"kubernetes.io/projected/f257fd05-c591-4324-94b4-8f87a7741118-kube-api-access-z6q2b\") pod \"mariadb-operator-controller-manager-6994f66f48-xnn4r\" (UID: \"f257fd05-c591-4324-94b4-8f87a7741118\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"
Feb 23 14:46:08.535245 master-0 kubenswrapper[28758]: I0223 14:46:08.534102 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwsxn\" (UniqueName: \"kubernetes.io/projected/b00988a0-8239-4f73-832d-5e28c3afac6a-kube-api-access-wwsxn\") pod \"keystone-operator-controller-manager-b4d948c87-zftk6\" (UID: \"b00988a0-8239-4f73-832d-5e28c3afac6a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"
Feb 23 14:46:08.535245 master-0 kubenswrapper[28758]: I0223 14:46:08.534164 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxrrm\" (UniqueName: \"kubernetes.io/projected/63f4cc5c-7ea4-40dd-8ada-a8508d600f2a-kube-api-access-xxrrm\") pod \"neutron-operator-controller-manager-6bd4687957-bsbx5\" (UID: \"63f4cc5c-7ea4-40dd-8ada-a8508d600f2a\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"
Feb 23 14:46:08.535245 master-0 kubenswrapper[28758]: I0223 14:46:08.534278 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmm7\" (UniqueName: \"kubernetes.io/projected/ada28ec1-a66b-4668-941c-c0f0cd424ee4-kube-api-access-fgmm7\") pod \"ironic-operator-controller-manager-554564d7fc-dkqkb\" (UID: \"ada28ec1-a66b-4668-941c-c0f0cd424ee4\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb"
Feb 23 14:46:08.535245 master-0 kubenswrapper[28758]: I0223 14:46:08.534416 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtdx\" (UniqueName: \"kubernetes.io/projected/61cee583-7aa7-483b-b0e4-96f48d26a940-kube-api-access-frtdx\") pod \"manila-operator-controller-manager-67d996989d-7kkjf\" (UID: \"61cee583-7aa7-483b-b0e4-96f48d26a940\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"
Feb 23 14:46:08.536341 master-0 kubenswrapper[28758]: I0223 14:46:08.536170 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcc5n\" (UniqueName: \"kubernetes.io/projected/13419178-05f6-4d41-be2b-2849b477ff68-kube-api-access-xcc5n\") pod \"heat-operator-controller-manager-69f49c598c-psqph\" (UID: \"13419178-05f6-4d41-be2b-2849b477ff68\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph"
Feb 23 14:46:08.536341 master-0 kubenswrapper[28758]: I0223 14:46:08.536208 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcp8\" (UniqueName: \"kubernetes.io/projected/64270f21-553d-4155-b8e0-b45d5547285b-kube-api-access-rxcp8\") pod \"horizon-operator-controller-manager-5b9b8895d5-5kwfs\" (UID: \"64270f21-553d-4155-b8e0-b45d5547285b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs"
Feb 23 14:46:08.536341 master-0 kubenswrapper[28758]: I0223 14:46:08.536213 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnwr6\" (UniqueName: \"kubernetes.io/projected/89e18c7d-ffb4-44e5-b640-e26175c114e1-kube-api-access-lnwr6\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"
Feb 23 14:46:08.546543 master-0 kubenswrapper[28758]: I0223 14:46:08.546127 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs"
Feb 23 14:46:08.548564 master-0 kubenswrapper[28758]: I0223 14:46:08.547162 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdcjb\" (UniqueName: \"kubernetes.io/projected/5602bab5-e632-45c0-9a58-fca8c507ff8d-kube-api-access-pdcjb\") pod \"designate-operator-controller-manager-6d8bf5c495-vd6sh\" (UID: \"5602bab5-e632-45c0-9a58-fca8c507ff8d\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh"
Feb 23 14:46:08.553602 master-0 kubenswrapper[28758]: I0223 14:46:08.552977 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"]
Feb 23 14:46:08.592450 master-0 kubenswrapper[28758]: I0223 14:46:08.586388 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwsxn\" (UniqueName: \"kubernetes.io/projected/b00988a0-8239-4f73-832d-5e28c3afac6a-kube-api-access-wwsxn\") pod \"keystone-operator-controller-manager-b4d948c87-zftk6\" (UID: \"b00988a0-8239-4f73-832d-5e28c3afac6a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"
Feb 23 14:46:08.592450 master-0 kubenswrapper[28758]: I0223 14:46:08.591226 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgmm7\" (UniqueName: \"kubernetes.io/projected/ada28ec1-a66b-4668-941c-c0f0cd424ee4-kube-api-access-fgmm7\") pod \"ironic-operator-controller-manager-554564d7fc-dkqkb\" (UID: \"ada28ec1-a66b-4668-941c-c0f0cd424ee4\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb"
Feb 23 14:46:08.595170 master-0 kubenswrapper[28758]: I0223 14:46:08.595113 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"]
Feb 23 14:46:08.596706 master-0 kubenswrapper[28758]: I0223 14:46:08.596589 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"
Feb 23 14:46:08.599876 master-0 kubenswrapper[28758]: I0223 14:46:08.599841 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 23 14:46:08.621707 master-0 kubenswrapper[28758]: I0223 14:46:08.620692 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"]
Feb 23 14:46:08.621966 master-0 kubenswrapper[28758]: I0223 14:46:08.621922 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"
Feb 23 14:46:08.639902 master-0 kubenswrapper[28758]: I0223 14:46:08.639839 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frtdx\" (UniqueName: \"kubernetes.io/projected/61cee583-7aa7-483b-b0e4-96f48d26a940-kube-api-access-frtdx\") pod \"manila-operator-controller-manager-67d996989d-7kkjf\" (UID: \"61cee583-7aa7-483b-b0e4-96f48d26a940\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"
Feb 23 14:46:08.640028 master-0 kubenswrapper[28758]: I0223 14:46:08.640003 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r6lc\" (UniqueName: \"kubernetes.io/projected/eb045dcf-8041-4baf-a750-6ad9e951dd65-kube-api-access-4r6lc\") pod \"nova-operator-controller-manager-567668f5cf-bcvt8\" (UID: \"eb045dcf-8041-4baf-a750-6ad9e951dd65\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"
Feb 23 14:46:08.640093 master-0 kubenswrapper[28758]: I0223 14:46:08.640040 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6q2b\" (UniqueName: \"kubernetes.io/projected/f257fd05-c591-4324-94b4-8f87a7741118-kube-api-access-z6q2b\") pod \"mariadb-operator-controller-manager-6994f66f48-xnn4r\" (UID: \"f257fd05-c591-4324-94b4-8f87a7741118\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"
Feb 23 14:46:08.640131 master-0 kubenswrapper[28758]: I0223 14:46:08.640086 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cts7m\" (UniqueName: \"kubernetes.io/projected/0e1182a2-d37e-45d6-9457-af639fc42480-kube-api-access-cts7m\") pod \"octavia-operator-controller-manager-659dc6bbfc-2lsqj\" (UID: \"0e1182a2-d37e-45d6-9457-af639fc42480\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"
Feb 23 14:46:08.640188 master-0 kubenswrapper[28758]: I0223 14:46:08.640135 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxrrm\" (UniqueName: \"kubernetes.io/projected/63f4cc5c-7ea4-40dd-8ada-a8508d600f2a-kube-api-access-xxrrm\") pod \"neutron-operator-controller-manager-6bd4687957-bsbx5\" (UID: \"63f4cc5c-7ea4-40dd-8ada-a8508d600f2a\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"
Feb 23 14:46:08.645302 master-0 kubenswrapper[28758]: I0223 14:46:08.644077 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"]
Feb 23 14:46:08.674314 master-0 kubenswrapper[28758]: I0223 14:46:08.668802 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb"
Feb 23 14:46:08.680008 master-0 kubenswrapper[28758]: I0223 14:46:08.679720 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"]
Feb 23 14:46:08.681180 master-0 kubenswrapper[28758]: I0223 14:46:08.681132 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"
Feb 23 14:46:08.682332 master-0 kubenswrapper[28758]: I0223 14:46:08.682289 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"
Feb 23 14:46:08.701269 master-0 kubenswrapper[28758]: I0223 14:46:08.698440 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6q2b\" (UniqueName: \"kubernetes.io/projected/f257fd05-c591-4324-94b4-8f87a7741118-kube-api-access-z6q2b\") pod \"mariadb-operator-controller-manager-6994f66f48-xnn4r\" (UID: \"f257fd05-c591-4324-94b4-8f87a7741118\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"
Feb 23 14:46:08.704914 master-0 kubenswrapper[28758]: I0223 14:46:08.704873 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r6lc\" (UniqueName: \"kubernetes.io/projected/eb045dcf-8041-4baf-a750-6ad9e951dd65-kube-api-access-4r6lc\") pod \"nova-operator-controller-manager-567668f5cf-bcvt8\" (UID: \"eb045dcf-8041-4baf-a750-6ad9e951dd65\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"
Feb 23 14:46:08.706419 master-0 kubenswrapper[28758]: I0223 14:46:08.706387 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtdx\" (UniqueName: \"kubernetes.io/projected/61cee583-7aa7-483b-b0e4-96f48d26a940-kube-api-access-frtdx\") pod \"manila-operator-controller-manager-67d996989d-7kkjf\" (UID: \"61cee583-7aa7-483b-b0e4-96f48d26a940\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"
Feb 23 14:46:08.713536 master-0 kubenswrapper[28758]: I0223 14:46:08.712777 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxrrm\" (UniqueName: \"kubernetes.io/projected/63f4cc5c-7ea4-40dd-8ada-a8508d600f2a-kube-api-access-xxrrm\") pod \"neutron-operator-controller-manager-6bd4687957-bsbx5\" (UID: \"63f4cc5c-7ea4-40dd-8ada-a8508d600f2a\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"
Feb 23 14:46:08.723128 master-0 kubenswrapper[28758]: I0223 14:46:08.722972 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"]
Feb 23 14:46:08.723834 master-0 kubenswrapper[28758]: I0223 14:46:08.723462 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh"
Feb 23 14:46:08.731330 master-0 kubenswrapper[28758]: I0223 14:46:08.731285 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"
Feb 23 14:46:08.771553 master-0 kubenswrapper[28758]: I0223 14:46:08.761249 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbx4\" (UniqueName: \"kubernetes.io/projected/b2991a48-1675-489b-b719-b366221c255f-kube-api-access-nkbx4\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"
Feb 23 14:46:08.771553 master-0 kubenswrapper[28758]: I0223 14:46:08.761334 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfj64\" (UniqueName: \"kubernetes.io/projected/6520dae0-e31c-40d1-a562-6e0631553146-kube-api-access-rfj64\") pod \"ovn-operator-controller-manager-5955d8c787-wll2v\" (UID: \"6520dae0-e31c-40d1-a562-6e0631553146\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"
Feb 23 14:46:08.771553 master-0 kubenswrapper[28758]: I0223 14:46:08.761368 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"
Feb 23 14:46:08.771553 master-0 kubenswrapper[28758]: I0223 14:46:08.764314 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cts7m\" (UniqueName: \"kubernetes.io/projected/0e1182a2-d37e-45d6-9457-af639fc42480-kube-api-access-cts7m\") pod \"octavia-operator-controller-manager-659dc6bbfc-2lsqj\" (UID: \"0e1182a2-d37e-45d6-9457-af639fc42480\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"
Feb 23 14:46:08.771553 master-0 kubenswrapper[28758]: I0223 14:46:08.767156 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"]
Feb 23 14:46:08.800664 master-0 kubenswrapper[28758]: I0223 14:46:08.800333 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cts7m\" (UniqueName: \"kubernetes.io/projected/0e1182a2-d37e-45d6-9457-af639fc42480-kube-api-access-cts7m\") pod \"octavia-operator-controller-manager-659dc6bbfc-2lsqj\" (UID: \"0e1182a2-d37e-45d6-9457-af639fc42480\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"
Feb 23 14:46:08.812552 master-0 kubenswrapper[28758]: I0223 14:46:08.807781 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"
Feb 23 14:46:08.816456 master-0 kubenswrapper[28758]: I0223 14:46:08.816145 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl"
Feb 23 14:46:08.822271 master-0 kubenswrapper[28758]: I0223 14:46:08.822228 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"
Feb 23 14:46:08.825588 master-0 kubenswrapper[28758]: I0223 14:46:08.825532 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph"
Feb 23 14:46:08.840987 master-0 kubenswrapper[28758]: I0223 14:46:08.840724 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv"]
Feb 23 14:46:08.843237 master-0 kubenswrapper[28758]: I0223 14:46:08.843197 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv"
Feb 23 14:46:08.865130 master-0 kubenswrapper[28758]: I0223 14:46:08.865065 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv"]
Feb 23 14:46:08.866245 master-0 kubenswrapper[28758]: I0223 14:46:08.866186 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h5cd\" (UniqueName: \"kubernetes.io/projected/8cc75447-5114-451b-a0c7-accd382d82eb-kube-api-access-7h5cd\") pod \"placement-operator-controller-manager-8497b45c89-gtxbn\" (UID: \"8cc75447-5114-451b-a0c7-accd382d82eb\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"
Feb 23 14:46:08.866363 master-0 kubenswrapper[28758]: I0223 14:46:08.866325 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbx4\" (UniqueName: \"kubernetes.io/projected/b2991a48-1675-489b-b719-b366221c255f-kube-api-access-nkbx4\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"
Feb 23 14:46:08.866412 master-0 kubenswrapper[28758]: I0223 14:46:08.866369 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfj64\" (UniqueName: \"kubernetes.io/projected/6520dae0-e31c-40d1-a562-6e0631553146-kube-api-access-rfj64\") pod \"ovn-operator-controller-manager-5955d8c787-wll2v\" (UID: \"6520dae0-e31c-40d1-a562-6e0631553146\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"
Feb 23 14:46:08.866412 master-0 kubenswrapper[28758]: I0223 14:46:08.866401 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"
Feb 23 14:46:08.867019 master-0 kubenswrapper[28758]: E0223 14:46:08.866788 28758 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 14:46:08.867019 master-0 kubenswrapper[28758]: E0223 14:46:08.866872 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert podName:b2991a48-1675-489b-b719-b366221c255f nodeName:}" failed. No retries permitted until 2026-02-23 14:46:09.36685175 +0000 UTC m=+701.493167682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert") pod "openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" (UID: "b2991a48-1675-489b-b719-b366221c255f") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 23 14:46:08.894926 master-0 kubenswrapper[28758]: I0223 14:46:08.894000 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbx4\" (UniqueName: \"kubernetes.io/projected/b2991a48-1675-489b-b719-b366221c255f-kube-api-access-nkbx4\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"
Feb 23 14:46:08.906188 master-0 kubenswrapper[28758]: I0223 14:46:08.904588 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"
Feb 23 14:46:08.914122 master-0 kubenswrapper[28758]: I0223 14:46:08.914048 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-44k55"]
Feb 23 14:46:08.925442 master-0 kubenswrapper[28758]: I0223 14:46:08.915674 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55"
Feb 23 14:46:08.925442 master-0 kubenswrapper[28758]: I0223 14:46:08.923471 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfj64\" (UniqueName: \"kubernetes.io/projected/6520dae0-e31c-40d1-a562-6e0631553146-kube-api-access-rfj64\") pod \"ovn-operator-controller-manager-5955d8c787-wll2v\" (UID: \"6520dae0-e31c-40d1-a562-6e0631553146\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"
Feb 23 14:46:08.942974 master-0 kubenswrapper[28758]: I0223 14:46:08.940094 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-44k55"]
Feb 23 14:46:08.965507 master-0 kubenswrapper[28758]: I0223 14:46:08.964950 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv"]
Feb 23 14:46:08.968493 master-0 kubenswrapper[28758]: I0223 14:46:08.968381 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv"
Feb 23 14:46:08.974776 master-0 kubenswrapper[28758]: I0223 14:46:08.973612 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv"]
Feb 23 14:46:08.974776 master-0 kubenswrapper[28758]: I0223 14:46:08.974701 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"
Feb 23 14:46:08.974776 master-0 kubenswrapper[28758]: I0223 14:46:08.974756 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8lg\" (UniqueName: \"kubernetes.io/projected/0114c0ec-3af5-4d4e-adac-34f5471c64ce-kube-api-access-rn8lg\") pod \"swift-operator-controller-manager-68f46476f-jjtxv\" (UID: \"0114c0ec-3af5-4d4e-adac-34f5471c64ce\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv"
Feb 23 14:46:08.975024 master-0 kubenswrapper[28758]: I0223 14:46:08.974835 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h5cd\" (UniqueName: \"kubernetes.io/projected/8cc75447-5114-451b-a0c7-accd382d82eb-kube-api-access-7h5cd\") pod \"placement-operator-controller-manager-8497b45c89-gtxbn\" (UID: \"8cc75447-5114-451b-a0c7-accd382d82eb\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"
Feb 23 14:46:08.977597 master-0 kubenswrapper[28758]: E0223 14:46:08.975827 28758 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 23 14:46:08.977597 master-0 kubenswrapper[28758]: E0223 14:46:08.975887 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert podName:89e18c7d-ffb4-44e5-b640-e26175c114e1 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:09.975869144 +0000 UTC m=+702.102185086 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert") pod "infra-operator-controller-manager-5f879c76b6-5zmbh" (UID: "89e18c7d-ffb4-44e5-b640-e26175c114e1") : secret "infra-operator-webhook-server-cert" not found
Feb 23 14:46:08.988953 master-0 kubenswrapper[28758]: I0223 14:46:08.988826 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"
Feb 23 14:46:09.000420 master-0 kubenswrapper[28758]: I0223 14:46:08.999294 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"
Feb 23 14:46:09.013861 master-0 kubenswrapper[28758]: I0223 14:46:09.009194 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h5cd\" (UniqueName: \"kubernetes.io/projected/8cc75447-5114-451b-a0c7-accd382d82eb-kube-api-access-7h5cd\") pod \"placement-operator-controller-manager-8497b45c89-gtxbn\" (UID: \"8cc75447-5114-451b-a0c7-accd382d82eb\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"
Feb 23 14:46:09.022821 master-0 kubenswrapper[28758]: I0223 14:46:09.022745 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"
Feb 23 14:46:09.060656 master-0 kubenswrapper[28758]: I0223 14:46:09.056525 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf"]
Feb 23 14:46:09.060656 master-0 kubenswrapper[28758]: I0223 14:46:09.058187 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf"
Feb 23 14:46:09.077198 master-0 kubenswrapper[28758]: I0223 14:46:09.076083 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7dfn\" (UniqueName: \"kubernetes.io/projected/b496c821-47a3-4f07-838c-54560eebf847-kube-api-access-x7dfn\") pod \"telemetry-operator-controller-manager-589c568786-44k55\" (UID: \"b496c821-47a3-4f07-838c-54560eebf847\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55"
Feb 23 14:46:09.077198 master-0 kubenswrapper[28758]: I0223 14:46:09.076213 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jl4q\" (UniqueName: \"kubernetes.io/projected/10b5a636-69f5-4828-8e1e-9a3a598c28aa-kube-api-access-9jl4q\") pod \"test-operator-controller-manager-5dc6794d5b-n9hpv\" (UID: \"10b5a636-69f5-4828-8e1e-9a3a598c28aa\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv"
Feb 23 14:46:09.077198 master-0 kubenswrapper[28758]: I0223 14:46:09.076267 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8lg\" (UniqueName: \"kubernetes.io/projected/0114c0ec-3af5-4d4e-adac-34f5471c64ce-kube-api-access-rn8lg\") pod \"swift-operator-controller-manager-68f46476f-jjtxv\" (UID: \"0114c0ec-3af5-4d4e-adac-34f5471c64ce\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv"
Feb 23 14:46:09.131724 master-0 kubenswrapper[28758]: I0223 14:46:09.131300 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8lg\" (UniqueName: \"kubernetes.io/projected/0114c0ec-3af5-4d4e-adac-34f5471c64ce-kube-api-access-rn8lg\") pod \"swift-operator-controller-manager-68f46476f-jjtxv\" (UID: \"0114c0ec-3af5-4d4e-adac-34f5471c64ce\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" Feb 23 14:46:09.136839 master-0 kubenswrapper[28758]: I0223 14:46:09.136653 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf"] Feb 23 14:46:09.182573 master-0 kubenswrapper[28758]: I0223 14:46:09.179386 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d6hf\" (UniqueName: \"kubernetes.io/projected/19dbaca0-2912-45da-bc06-8050bd931706-kube-api-access-6d6hf\") pod \"watcher-operator-controller-manager-bccc79885-pxqtf\" (UID: \"19dbaca0-2912-45da-bc06-8050bd931706\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" Feb 23 14:46:09.182573 master-0 kubenswrapper[28758]: I0223 14:46:09.179526 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jl4q\" (UniqueName: \"kubernetes.io/projected/10b5a636-69f5-4828-8e1e-9a3a598c28aa-kube-api-access-9jl4q\") pod \"test-operator-controller-manager-5dc6794d5b-n9hpv\" (UID: \"10b5a636-69f5-4828-8e1e-9a3a598c28aa\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" Feb 23 14:46:09.182573 master-0 kubenswrapper[28758]: I0223 14:46:09.179969 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7dfn\" (UniqueName: \"kubernetes.io/projected/b496c821-47a3-4f07-838c-54560eebf847-kube-api-access-x7dfn\") pod \"telemetry-operator-controller-manager-589c568786-44k55\" (UID: 
\"b496c821-47a3-4f07-838c-54560eebf847\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" Feb 23 14:46:09.215836 master-0 kubenswrapper[28758]: I0223 14:46:09.214328 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7dfn\" (UniqueName: \"kubernetes.io/projected/b496c821-47a3-4f07-838c-54560eebf847-kube-api-access-x7dfn\") pod \"telemetry-operator-controller-manager-589c568786-44k55\" (UID: \"b496c821-47a3-4f07-838c-54560eebf847\") " pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" Feb 23 14:46:09.215836 master-0 kubenswrapper[28758]: I0223 14:46:09.215357 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jl4q\" (UniqueName: \"kubernetes.io/projected/10b5a636-69f5-4828-8e1e-9a3a598c28aa-kube-api-access-9jl4q\") pod \"test-operator-controller-manager-5dc6794d5b-n9hpv\" (UID: \"10b5a636-69f5-4828-8e1e-9a3a598c28aa\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" Feb 23 14:46:09.251509 master-0 kubenswrapper[28758]: I0223 14:46:09.250080 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj"] Feb 23 14:46:09.254998 master-0 kubenswrapper[28758]: I0223 14:46:09.252284 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.256002 master-0 kubenswrapper[28758]: I0223 14:46:09.255961 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 23 14:46:09.256186 master-0 kubenswrapper[28758]: I0223 14:46:09.256165 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 23 14:46:09.264706 master-0 kubenswrapper[28758]: I0223 14:46:09.260417 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj"] Feb 23 14:46:09.282183 master-0 kubenswrapper[28758]: I0223 14:46:09.281937 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d6hf\" (UniqueName: \"kubernetes.io/projected/19dbaca0-2912-45da-bc06-8050bd931706-kube-api-access-6d6hf\") pod \"watcher-operator-controller-manager-bccc79885-pxqtf\" (UID: \"19dbaca0-2912-45da-bc06-8050bd931706\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" Feb 23 14:46:09.292990 master-0 kubenswrapper[28758]: I0223 14:46:09.292550 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk"] Feb 23 14:46:09.299494 master-0 kubenswrapper[28758]: I0223 14:46:09.297739 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b"] Feb 23 14:46:09.299494 master-0 kubenswrapper[28758]: I0223 14:46:09.299051 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" Feb 23 14:46:09.317559 master-0 kubenswrapper[28758]: I0223 14:46:09.316864 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b"] Feb 23 14:46:09.326520 master-0 kubenswrapper[28758]: I0223 14:46:09.325291 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d6hf\" (UniqueName: \"kubernetes.io/projected/19dbaca0-2912-45da-bc06-8050bd931706-kube-api-access-6d6hf\") pod \"watcher-operator-controller-manager-bccc79885-pxqtf\" (UID: \"19dbaca0-2912-45da-bc06-8050bd931706\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" Feb 23 14:46:09.330937 master-0 kubenswrapper[28758]: I0223 14:46:09.330853 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs"] Feb 23 14:46:09.354505 master-0 kubenswrapper[28758]: I0223 14:46:09.349140 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h"] Feb 23 14:46:09.385894 master-0 kubenswrapper[28758]: I0223 14:46:09.385290 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:09.385894 master-0 kubenswrapper[28758]: I0223 14:46:09.385400 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod 
\"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.385894 master-0 kubenswrapper[28758]: E0223 14:46:09.385432 28758 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:09.385894 master-0 kubenswrapper[28758]: I0223 14:46:09.385460 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnwv2\" (UniqueName: \"kubernetes.io/projected/98eda798-728f-481c-bb71-8a12ad8a3c37-kube-api-access-xnwv2\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.385894 master-0 kubenswrapper[28758]: E0223 14:46:09.385570 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert podName:b2991a48-1675-489b-b719-b366221c255f nodeName:}" failed. No retries permitted until 2026-02-23 14:46:10.385548177 +0000 UTC m=+702.511864119 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert") pod "openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" (UID: "b2991a48-1675-489b-b719-b366221c255f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:09.385894 master-0 kubenswrapper[28758]: I0223 14:46:09.385548 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.385894 master-0 kubenswrapper[28758]: I0223 14:46:09.385638 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pnbm\" (UniqueName: \"kubernetes.io/projected/9b189b3e-46a8-4ae5-8037-09238906f46c-kube-api-access-6pnbm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mlb5b\" (UID: \"9b189b3e-46a8-4ae5-8037-09238906f46c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" Feb 23 14:46:09.395504 master-0 kubenswrapper[28758]: I0223 14:46:09.387193 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" Feb 23 14:46:09.468889 master-0 kubenswrapper[28758]: I0223 14:46:09.462657 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" Feb 23 14:46:09.474569 master-0 kubenswrapper[28758]: I0223 14:46:09.471719 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" Feb 23 14:46:09.474569 master-0 kubenswrapper[28758]: I0223 14:46:09.473067 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" event={"ID":"b99e7588-8c4f-48ff-8de9-12eef47ee79b","Type":"ContainerStarted","Data":"b37a46931e6e7e773d6a0503b25847d9b7a35d3f525b965be4392ead436e0c5d"} Feb 23 14:46:09.486226 master-0 kubenswrapper[28758]: I0223 14:46:09.481751 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs" event={"ID":"64270f21-553d-4155-b8e0-b45d5547285b","Type":"ContainerStarted","Data":"2c56525e4cc524f5a39b204655fd57466db6bf12bd50b6d784aa1497e663aad9"} Feb 23 14:46:09.487137 master-0 kubenswrapper[28758]: I0223 14:46:09.487081 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.487285 master-0 kubenswrapper[28758]: E0223 14:46:09.487247 28758 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 14:46:09.487285 master-0 kubenswrapper[28758]: I0223 14:46:09.487273 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnwv2\" (UniqueName: \"kubernetes.io/projected/98eda798-728f-481c-bb71-8a12ad8a3c37-kube-api-access-xnwv2\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.487424 master-0 
kubenswrapper[28758]: E0223 14:46:09.487325 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:09.987306218 +0000 UTC m=+702.113622150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "webhook-server-cert" not found Feb 23 14:46:09.487424 master-0 kubenswrapper[28758]: I0223 14:46:09.487353 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.487628 master-0 kubenswrapper[28758]: I0223 14:46:09.487456 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pnbm\" (UniqueName: \"kubernetes.io/projected/9b189b3e-46a8-4ae5-8037-09238906f46c-kube-api-access-6pnbm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mlb5b\" (UID: \"9b189b3e-46a8-4ae5-8037-09238906f46c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" Feb 23 14:46:09.487847 master-0 kubenswrapper[28758]: E0223 14:46:09.487824 28758 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 14:46:09.487896 master-0 kubenswrapper[28758]: E0223 14:46:09.487862 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs 
podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:09.987853183 +0000 UTC m=+702.114169115 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "metrics-server-cert" not found Feb 23 14:46:09.499497 master-0 kubenswrapper[28758]: I0223 14:46:09.499180 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" event={"ID":"c9893b1e-ab36-4e25-93f9-fe364916347a","Type":"ContainerStarted","Data":"3e6bfb2304f4ce3cd2b18865ffb8d505c967f48d3b1dda6f317e36b6c5146944"} Feb 23 14:46:09.516566 master-0 kubenswrapper[28758]: I0223 14:46:09.516502 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnwv2\" (UniqueName: \"kubernetes.io/projected/98eda798-728f-481c-bb71-8a12ad8a3c37-kube-api-access-xnwv2\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:09.534372 master-0 kubenswrapper[28758]: I0223 14:46:09.534312 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pnbm\" (UniqueName: \"kubernetes.io/projected/9b189b3e-46a8-4ae5-8037-09238906f46c-kube-api-access-6pnbm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mlb5b\" (UID: \"9b189b3e-46a8-4ae5-8037-09238906f46c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" Feb 23 14:46:09.655607 master-0 kubenswrapper[28758]: I0223 14:46:09.651848 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" Feb 23 14:46:10.019265 master-0 kubenswrapper[28758]: E0223 14:46:10.019208 28758 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 14:46:10.019853 master-0 kubenswrapper[28758]: E0223 14:46:10.019303 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert podName:89e18c7d-ffb4-44e5-b640-e26175c114e1 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:12.019286049 +0000 UTC m=+704.145601971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert") pod "infra-operator-controller-manager-5f879c76b6-5zmbh" (UID: "89e18c7d-ffb4-44e5-b640-e26175c114e1") : secret "infra-operator-webhook-server-cert" not found Feb 23 14:46:10.025518 master-0 kubenswrapper[28758]: I0223 14:46:10.019098 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:10.025518 master-0 kubenswrapper[28758]: I0223 14:46:10.021297 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:10.025518 master-0 kubenswrapper[28758]: E0223 14:46:10.021416 28758 secret.go:189] Couldn't 
get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 14:46:10.025518 master-0 kubenswrapper[28758]: E0223 14:46:10.021464 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:11.021450796 +0000 UTC m=+703.147766728 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "webhook-server-cert" not found Feb 23 14:46:10.025518 master-0 kubenswrapper[28758]: I0223 14:46:10.021586 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:10.025518 master-0 kubenswrapper[28758]: E0223 14:46:10.021798 28758 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 14:46:10.025518 master-0 kubenswrapper[28758]: E0223 14:46:10.021895 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:11.021881438 +0000 UTC m=+703.148197370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "metrics-server-cert" not found Feb 23 14:46:10.048613 master-0 kubenswrapper[28758]: I0223 14:46:10.047988 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" Feb 23 14:46:10.435291 master-0 kubenswrapper[28758]: I0223 14:46:10.434704 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:10.435291 master-0 kubenswrapper[28758]: E0223 14:46:10.434897 28758 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:10.435291 master-0 kubenswrapper[28758]: E0223 14:46:10.435064 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert podName:b2991a48-1675-489b-b719-b366221c255f nodeName:}" failed. No retries permitted until 2026-02-23 14:46:12.435039744 +0000 UTC m=+704.561355676 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert") pod "openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" (UID: "b2991a48-1675-489b-b719-b366221c255f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:10.973875 master-0 kubenswrapper[28758]: I0223 14:46:10.973796 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6"] Feb 23 14:46:10.990543 master-0 kubenswrapper[28758]: I0223 14:46:10.985578 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-psqph"] Feb 23 14:46:10.996298 master-0 kubenswrapper[28758]: I0223 14:46:10.994875 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8"] Feb 23 14:46:11.003666 master-0 kubenswrapper[28758]: I0223 14:46:11.003600 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb"] Feb 23 14:46:11.011241 master-0 kubenswrapper[28758]: I0223 14:46:11.010752 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh"] Feb 23 14:46:11.052439 master-0 kubenswrapper[28758]: I0223 14:46:11.052303 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:11.053036 master-0 kubenswrapper[28758]: I0223 14:46:11.052520 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:11.053036 master-0 kubenswrapper[28758]: E0223 14:46:11.052720 28758 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 14:46:11.053036 master-0 kubenswrapper[28758]: E0223 14:46:11.052815 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:13.052788991 +0000 UTC m=+705.179105013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "metrics-server-cert" not found Feb 23 14:46:11.053036 master-0 kubenswrapper[28758]: E0223 14:46:11.052922 28758 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 14:46:11.053321 master-0 kubenswrapper[28758]: E0223 14:46:11.053046 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:13.053025617 +0000 UTC m=+705.179341549 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "webhook-server-cert" not found Feb 23 14:46:11.527722 master-0 kubenswrapper[28758]: I0223 14:46:11.527668 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj"] Feb 23 14:46:11.536586 master-0 kubenswrapper[28758]: I0223 14:46:11.536525 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn"] Feb 23 14:46:11.545437 master-0 kubenswrapper[28758]: I0223 14:46:11.545312 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v"] Feb 23 14:46:11.553509 master-0 kubenswrapper[28758]: I0223 14:46:11.552090 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv"] Feb 23 14:46:11.561020 master-0 kubenswrapper[28758]: I0223 14:46:11.560757 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5"] Feb 23 14:46:11.569365 master-0 kubenswrapper[28758]: I0223 14:46:11.569295 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl"] Feb 23 14:46:12.041546 master-0 kubenswrapper[28758]: I0223 14:46:12.038585 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r"] Feb 23 14:46:12.047825 master-0 kubenswrapper[28758]: I0223 14:46:12.047598 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf"] Feb 23 
14:46:12.055218 master-0 kubenswrapper[28758]: I0223 14:46:12.055137 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv"] Feb 23 14:46:12.063556 master-0 kubenswrapper[28758]: I0223 14:46:12.063444 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-589c568786-44k55"] Feb 23 14:46:12.073147 master-0 kubenswrapper[28758]: I0223 14:46:12.073106 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf"] Feb 23 14:46:12.083686 master-0 kubenswrapper[28758]: I0223 14:46:12.083640 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:12.083874 master-0 kubenswrapper[28758]: E0223 14:46:12.083848 28758 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 14:46:12.083956 master-0 kubenswrapper[28758]: E0223 14:46:12.083900 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert podName:89e18c7d-ffb4-44e5-b640-e26175c114e1 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:16.083883719 +0000 UTC m=+708.210199651 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert") pod "infra-operator-controller-manager-5f879c76b6-5zmbh" (UID: "89e18c7d-ffb4-44e5-b640-e26175c114e1") : secret "infra-operator-webhook-server-cert" not found Feb 23 14:46:12.086495 master-0 kubenswrapper[28758]: I0223 14:46:12.086384 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b"] Feb 23 14:46:12.273170 master-0 kubenswrapper[28758]: W0223 14:46:12.273110 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada28ec1_a66b_4668_941c_c0f0cd424ee4.slice/crio-6abe6f503eb414ce57b6760c189f4937202e375d6a254cac3c1f127f14ab8660 WatchSource:0}: Error finding container 6abe6f503eb414ce57b6760c189f4937202e375d6a254cac3c1f127f14ab8660: Status 404 returned error can't find the container with id 6abe6f503eb414ce57b6760c189f4937202e375d6a254cac3c1f127f14ab8660 Feb 23 14:46:12.491546 master-0 kubenswrapper[28758]: I0223 14:46:12.491037 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:12.491546 master-0 kubenswrapper[28758]: E0223 14:46:12.491327 28758 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:12.491546 master-0 kubenswrapper[28758]: E0223 14:46:12.491506 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert 
podName:b2991a48-1675-489b-b719-b366221c255f nodeName:}" failed. No retries permitted until 2026-02-23 14:46:16.491470338 +0000 UTC m=+708.617786270 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert") pod "openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" (UID: "b2991a48-1675-489b-b719-b366221c255f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:12.546363 master-0 kubenswrapper[28758]: I0223 14:46:12.546278 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl" event={"ID":"14ce0057-4678-4bf7-bf28-256945f8a589","Type":"ContainerStarted","Data":"a096d4b81459c7c47fc9c78cdeca7efa75c0b3e9f8c96e8f0e3f9923fc41c3df"} Feb 23 14:46:12.547805 master-0 kubenswrapper[28758]: I0223 14:46:12.547759 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" event={"ID":"ada28ec1-a66b-4668-941c-c0f0cd424ee4","Type":"ContainerStarted","Data":"6abe6f503eb414ce57b6760c189f4937202e375d6a254cac3c1f127f14ab8660"} Feb 23 14:46:12.869093 master-0 kubenswrapper[28758]: W0223 14:46:12.868927 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc75447_5114_451b_a0c7_accd382d82eb.slice/crio-ac12fdba7ffb7e7a65f1adb349260cc2fbb837496f3191da8197de9a9eaa618a WatchSource:0}: Error finding container ac12fdba7ffb7e7a65f1adb349260cc2fbb837496f3191da8197de9a9eaa618a: Status 404 returned error can't find the container with id ac12fdba7ffb7e7a65f1adb349260cc2fbb837496f3191da8197de9a9eaa618a Feb 23 14:46:12.870267 master-0 kubenswrapper[28758]: W0223 14:46:12.870221 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5602bab5_e632_45c0_9a58_fca8c507ff8d.slice/crio-7a1c97a78556b05a1cc3215a5bddd50340f2eaa9aada5bf2af9a300c26ecbb4b WatchSource:0}: Error finding container 7a1c97a78556b05a1cc3215a5bddd50340f2eaa9aada5bf2af9a300c26ecbb4b: Status 404 returned error can't find the container with id 7a1c97a78556b05a1cc3215a5bddd50340f2eaa9aada5bf2af9a300c26ecbb4b Feb 23 14:46:12.901150 master-0 kubenswrapper[28758]: W0223 14:46:12.901084 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb045dcf_8041_4baf_a750_6ad9e951dd65.slice/crio-39ee5e5a1fbbab043fb97bf0d025cd0646a3893d852c86ca82b332363688daff WatchSource:0}: Error finding container 39ee5e5a1fbbab043fb97bf0d025cd0646a3893d852c86ca82b332363688daff: Status 404 returned error can't find the container with id 39ee5e5a1fbbab043fb97bf0d025cd0646a3893d852c86ca82b332363688daff Feb 23 14:46:12.904429 master-0 kubenswrapper[28758]: W0223 14:46:12.904375 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb496c821_47a3_4f07_838c_54560eebf847.slice/crio-134d5d2b6efcb766954654e335acb15d2da4af1ca00052cde8be66b173b2f486 WatchSource:0}: Error finding container 134d5d2b6efcb766954654e335acb15d2da4af1ca00052cde8be66b173b2f486: Status 404 returned error can't find the container with id 134d5d2b6efcb766954654e335acb15d2da4af1ca00052cde8be66b173b2f486 Feb 23 14:46:12.905847 master-0 kubenswrapper[28758]: W0223 14:46:12.905754 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61cee583_7aa7_483b_b0e4_96f48d26a940.slice/crio-afaaccbdd5e1253fa97fc28d8432796439cccad87bdc8da9f494a5b8e93789e2 WatchSource:0}: Error finding container afaaccbdd5e1253fa97fc28d8432796439cccad87bdc8da9f494a5b8e93789e2: Status 404 returned error can't find the container 
with id afaaccbdd5e1253fa97fc28d8432796439cccad87bdc8da9f494a5b8e93789e2 Feb 23 14:46:12.907643 master-0 kubenswrapper[28758]: W0223 14:46:12.907607 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19dbaca0_2912_45da_bc06_8050bd931706.slice/crio-0cffb8a9d800679921eb90f9096e5a7ec8a6918629d9fb9e0096578d9b1b3452 WatchSource:0}: Error finding container 0cffb8a9d800679921eb90f9096e5a7ec8a6918629d9fb9e0096578d9b1b3452: Status 404 returned error can't find the container with id 0cffb8a9d800679921eb90f9096e5a7ec8a6918629d9fb9e0096578d9b1b3452 Feb 23 14:46:12.913181 master-0 kubenswrapper[28758]: W0223 14:46:12.913097 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0114c0ec_3af5_4d4e_adac_34f5471c64ce.slice/crio-1e45582b973f7189233c1bcf3c792aa42d1114e75ff7884ca4ae31636d295bcd WatchSource:0}: Error finding container 1e45582b973f7189233c1bcf3c792aa42d1114e75ff7884ca4ae31636d295bcd: Status 404 returned error can't find the container with id 1e45582b973f7189233c1bcf3c792aa42d1114e75ff7884ca4ae31636d295bcd Feb 23 14:46:12.917597 master-0 kubenswrapper[28758]: W0223 14:46:12.917204 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb00988a0_8239_4f73_832d_5e28c3afac6a.slice/crio-141ab43791eb6e9b253d0de29e57ab733fb50ffeb58a19ffc5021d87b1882e55 WatchSource:0}: Error finding container 141ab43791eb6e9b253d0de29e57ab733fb50ffeb58a19ffc5021d87b1882e55: Status 404 returned error can't find the container with id 141ab43791eb6e9b253d0de29e57ab733fb50ffeb58a19ffc5021d87b1882e55 Feb 23 14:46:12.921813 master-0 kubenswrapper[28758]: W0223 14:46:12.921754 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b5a636_69f5_4828_8e1e_9a3a598c28aa.slice/crio-6c236895ec576f4ea665627f3f9aa873fbbbd088b0311b94aa507ff9ea19d289 WatchSource:0}: Error finding container 6c236895ec576f4ea665627f3f9aa873fbbbd088b0311b94aa507ff9ea19d289: Status 404 returned error can't find the container with id 6c236895ec576f4ea665627f3f9aa873fbbbd088b0311b94aa507ff9ea19d289 Feb 23 14:46:12.929968 master-0 kubenswrapper[28758]: W0223 14:46:12.929883 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf257fd05_c591_4324_94b4_8f87a7741118.slice/crio-03820dd492072da8ee7381d806ff1916f488888055f8701d2f05ac33a81646d0 WatchSource:0}: Error finding container 03820dd492072da8ee7381d806ff1916f488888055f8701d2f05ac33a81646d0: Status 404 returned error can't find the container with id 03820dd492072da8ee7381d806ff1916f488888055f8701d2f05ac33a81646d0 Feb 23 14:46:12.931403 master-0 kubenswrapper[28758]: E0223 14:46:12.931334 28758 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z6q2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-xnn4r_openstack-operators(f257fd05-c591-4324-94b4-8f87a7741118): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 14:46:12.932764 master-0 kubenswrapper[28758]: E0223 14:46:12.932716 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" podUID="f257fd05-c591-4324-94b4-8f87a7741118" Feb 23 
14:46:12.936125 master-0 kubenswrapper[28758]: W0223 14:46:12.936068 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1182a2_d37e_45d6_9457_af639fc42480.slice/crio-0efab0a24b05729f7e2c5ec5e55f13ac75350c49da503ede370ab4010d7715fb WatchSource:0}: Error finding container 0efab0a24b05729f7e2c5ec5e55f13ac75350c49da503ede370ab4010d7715fb: Status 404 returned error can't find the container with id 0efab0a24b05729f7e2c5ec5e55f13ac75350c49da503ede370ab4010d7715fb Feb 23 14:46:12.936312 master-0 kubenswrapper[28758]: W0223 14:46:12.936254 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6520dae0_e31c_40d1_a562_6e0631553146.slice/crio-e89222d990ba8063541435571b35d0dc95c79877d69853af543e94526e670db0 WatchSource:0}: Error finding container e89222d990ba8063541435571b35d0dc95c79877d69853af543e94526e670db0: Status 404 returned error can't find the container with id e89222d990ba8063541435571b35d0dc95c79877d69853af543e94526e670db0 Feb 23 14:46:12.983307 master-0 kubenswrapper[28758]: E0223 14:46:12.983000 28758 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rfj64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5955d8c787-wll2v_openstack-operators(6520dae0-e31c-40d1-a562-6e0631553146): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 14:46:12.984712 master-0 kubenswrapper[28758]: E0223 14:46:12.984648 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" 
podUID="6520dae0-e31c-40d1-a562-6e0631553146" Feb 23 14:46:12.986803 master-0 kubenswrapper[28758]: E0223 14:46:12.986246 28758 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6pnbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000810000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mlb5b_openstack-operators(9b189b3e-46a8-4ae5-8037-09238906f46c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 14:46:12.986803 master-0 kubenswrapper[28758]: E0223 14:46:12.986408 28758 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cts7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-659dc6bbfc-2lsqj_openstack-operators(0e1182a2-d37e-45d6-9457-af639fc42480): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 14:46:12.986803 master-0 kubenswrapper[28758]: E0223 14:46:12.986613 28758 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xxrrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-6bd4687957-bsbx5_openstack-operators(63f4cc5c-7ea4-40dd-8ada-a8508d600f2a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 23 14:46:12.987594 master-0 kubenswrapper[28758]: E0223 14:46:12.987520 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" podUID="0e1182a2-d37e-45d6-9457-af639fc42480" Feb 23 14:46:12.987594 master-0 kubenswrapper[28758]: E0223 14:46:12.987566 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" podUID="9b189b3e-46a8-4ae5-8037-09238906f46c" Feb 23 14:46:12.987876 master-0 kubenswrapper[28758]: E0223 14:46:12.987768 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" 
podUID="63f4cc5c-7ea4-40dd-8ada-a8508d600f2a" Feb 23 14:46:13.107556 master-0 kubenswrapper[28758]: I0223 14:46:13.106036 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:13.107556 master-0 kubenswrapper[28758]: I0223 14:46:13.106155 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:13.107556 master-0 kubenswrapper[28758]: E0223 14:46:13.106218 28758 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 14:46:13.107556 master-0 kubenswrapper[28758]: E0223 14:46:13.106312 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:17.106286027 +0000 UTC m=+709.232601959 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "webhook-server-cert" not found Feb 23 14:46:13.107556 master-0 kubenswrapper[28758]: E0223 14:46:13.106340 28758 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 14:46:13.107556 master-0 kubenswrapper[28758]: E0223 14:46:13.106417 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:17.10640258 +0000 UTC m=+709.232718512 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "metrics-server-cert" not found Feb 23 14:46:13.564096 master-0 kubenswrapper[28758]: I0223 14:46:13.564028 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" event={"ID":"6520dae0-e31c-40d1-a562-6e0631553146","Type":"ContainerStarted","Data":"e89222d990ba8063541435571b35d0dc95c79877d69853af543e94526e670db0"} Feb 23 14:46:13.565409 master-0 kubenswrapper[28758]: E0223 14:46:13.565355 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" 
podUID="6520dae0-e31c-40d1-a562-6e0631553146" Feb 23 14:46:13.567625 master-0 kubenswrapper[28758]: I0223 14:46:13.567581 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf" event={"ID":"61cee583-7aa7-483b-b0e4-96f48d26a940","Type":"ContainerStarted","Data":"afaaccbdd5e1253fa97fc28d8432796439cccad87bdc8da9f494a5b8e93789e2"} Feb 23 14:46:13.569282 master-0 kubenswrapper[28758]: I0223 14:46:13.569241 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" event={"ID":"0114c0ec-3af5-4d4e-adac-34f5471c64ce","Type":"ContainerStarted","Data":"1e45582b973f7189233c1bcf3c792aa42d1114e75ff7884ca4ae31636d295bcd"} Feb 23 14:46:13.571991 master-0 kubenswrapper[28758]: I0223 14:46:13.571737 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn" event={"ID":"8cc75447-5114-451b-a0c7-accd382d82eb","Type":"ContainerStarted","Data":"ac12fdba7ffb7e7a65f1adb349260cc2fbb837496f3191da8197de9a9eaa618a"} Feb 23 14:46:13.575514 master-0 kubenswrapper[28758]: I0223 14:46:13.574981 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" event={"ID":"f257fd05-c591-4324-94b4-8f87a7741118","Type":"ContainerStarted","Data":"03820dd492072da8ee7381d806ff1916f488888055f8701d2f05ac33a81646d0"} Feb 23 14:46:13.576642 master-0 kubenswrapper[28758]: E0223 14:46:13.576601 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" podUID="f257fd05-c591-4324-94b4-8f87a7741118" Feb 23 
14:46:13.580601 master-0 kubenswrapper[28758]: I0223 14:46:13.578660 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6" event={"ID":"b00988a0-8239-4f73-832d-5e28c3afac6a","Type":"ContainerStarted","Data":"141ab43791eb6e9b253d0de29e57ab733fb50ffeb58a19ffc5021d87b1882e55"} Feb 23 14:46:13.597834 master-0 kubenswrapper[28758]: I0223 14:46:13.596052 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs" event={"ID":"64270f21-553d-4155-b8e0-b45d5547285b","Type":"ContainerStarted","Data":"97f77d2eecf7c2787b354cd4f95e864d0080bb30e0122f3b4531551cafd9a853"} Feb 23 14:46:13.597834 master-0 kubenswrapper[28758]: I0223 14:46:13.596141 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs" Feb 23 14:46:13.599106 master-0 kubenswrapper[28758]: I0223 14:46:13.599048 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" event={"ID":"c9893b1e-ab36-4e25-93f9-fe364916347a","Type":"ContainerStarted","Data":"c2c80648752f016fbd9c62972a1991d71bf81ba0bcddca44cb8af4d2932e9a19"} Feb 23 14:46:13.599903 master-0 kubenswrapper[28758]: I0223 14:46:13.599851 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" Feb 23 14:46:13.601411 master-0 kubenswrapper[28758]: I0223 14:46:13.601366 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" event={"ID":"13419178-05f6-4d41-be2b-2849b477ff68","Type":"ContainerStarted","Data":"998bd7c11103cc761bb9fbdc583fd1a3ecd0a4190bc075ce8a061e3bfb6138b5"} Feb 23 14:46:13.603211 master-0 kubenswrapper[28758]: I0223 14:46:13.603149 28758 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" event={"ID":"9b189b3e-46a8-4ae5-8037-09238906f46c","Type":"ContainerStarted","Data":"9ca68daefcf70423e6a3eaf1a55fe0fb59b76b86abee12ff35fe163bd3a79104"} Feb 23 14:46:13.615935 master-0 kubenswrapper[28758]: E0223 14:46:13.611515 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" podUID="9b189b3e-46a8-4ae5-8037-09238906f46c" Feb 23 14:46:13.628430 master-0 kubenswrapper[28758]: I0223 14:46:13.628358 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" event={"ID":"19dbaca0-2912-45da-bc06-8050bd931706","Type":"ContainerStarted","Data":"0cffb8a9d800679921eb90f9096e5a7ec8a6918629d9fb9e0096578d9b1b3452"} Feb 23 14:46:13.631463 master-0 kubenswrapper[28758]: I0223 14:46:13.631405 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" event={"ID":"b99e7588-8c4f-48ff-8de9-12eef47ee79b","Type":"ContainerStarted","Data":"271eb8c1d23dc6a92eb063a75e6e5eaad2768576ec496483a53d27eda9fd81f4"} Feb 23 14:46:13.632469 master-0 kubenswrapper[28758]: I0223 14:46:13.632405 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" Feb 23 14:46:13.639877 master-0 kubenswrapper[28758]: I0223 14:46:13.639820 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8" 
event={"ID":"eb045dcf-8041-4baf-a750-6ad9e951dd65","Type":"ContainerStarted","Data":"39ee5e5a1fbbab043fb97bf0d025cd0646a3893d852c86ca82b332363688daff"} Feb 23 14:46:13.649093 master-0 kubenswrapper[28758]: I0223 14:46:13.646031 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" event={"ID":"b496c821-47a3-4f07-838c-54560eebf847","Type":"ContainerStarted","Data":"134d5d2b6efcb766954654e335acb15d2da4af1ca00052cde8be66b173b2f486"} Feb 23 14:46:13.649093 master-0 kubenswrapper[28758]: I0223 14:46:13.648810 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" event={"ID":"0e1182a2-d37e-45d6-9457-af639fc42480","Type":"ContainerStarted","Data":"0efab0a24b05729f7e2c5ec5e55f13ac75350c49da503ede370ab4010d7715fb"} Feb 23 14:46:13.650724 master-0 kubenswrapper[28758]: I0223 14:46:13.650663 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" event={"ID":"5602bab5-e632-45c0-9a58-fca8c507ff8d","Type":"ContainerStarted","Data":"7a1c97a78556b05a1cc3215a5bddd50340f2eaa9aada5bf2af9a300c26ecbb4b"} Feb 23 14:46:13.652284 master-0 kubenswrapper[28758]: E0223 14:46:13.652181 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" podUID="0e1182a2-d37e-45d6-9457-af639fc42480" Feb 23 14:46:13.652639 master-0 kubenswrapper[28758]: I0223 14:46:13.652605 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" 
event={"ID":"10b5a636-69f5-4828-8e1e-9a3a598c28aa","Type":"ContainerStarted","Data":"6c236895ec576f4ea665627f3f9aa873fbbbd088b0311b94aa507ff9ea19d289"} Feb 23 14:46:13.656983 master-0 kubenswrapper[28758]: I0223 14:46:13.656042 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" event={"ID":"63f4cc5c-7ea4-40dd-8ada-a8508d600f2a","Type":"ContainerStarted","Data":"32df494f3488a2bc48163c89b13a3803e391edda1e6c7d3f7855a96fe6b0eacd"} Feb 23 14:46:13.659443 master-0 kubenswrapper[28758]: E0223 14:46:13.659378 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" podUID="63f4cc5c-7ea4-40dd-8ada-a8508d600f2a" Feb 23 14:46:13.683461 master-0 kubenswrapper[28758]: I0223 14:46:13.683232 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs" podStartSLOduration=1.940264971 podStartE2EDuration="5.68321202s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:09.256676557 +0000 UTC m=+701.382992489" lastFinishedPulling="2026-02-23 14:46:12.999623616 +0000 UTC m=+705.125939538" observedRunningTime="2026-02-23 14:46:13.677744775 +0000 UTC m=+705.804060697" watchObservedRunningTime="2026-02-23 14:46:13.68321202 +0000 UTC m=+705.809527952" Feb 23 14:46:13.687883 master-0 kubenswrapper[28758]: I0223 14:46:13.686654 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" podStartSLOduration=2.779508785 podStartE2EDuration="6.686641632s" 
podCreationTimestamp="2026-02-23 14:46:07 +0000 UTC" firstStartedPulling="2026-02-23 14:46:09.131465203 +0000 UTC m=+701.257781135" lastFinishedPulling="2026-02-23 14:46:13.03859805 +0000 UTC m=+705.164913982" observedRunningTime="2026-02-23 14:46:13.644564315 +0000 UTC m=+705.770880267" watchObservedRunningTime="2026-02-23 14:46:13.686641632 +0000 UTC m=+705.812957564" Feb 23 14:46:13.830435 master-0 kubenswrapper[28758]: I0223 14:46:13.830295 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" podStartSLOduration=3.093753695 podStartE2EDuration="6.830271584s" podCreationTimestamp="2026-02-23 14:46:07 +0000 UTC" firstStartedPulling="2026-02-23 14:46:09.267626177 +0000 UTC m=+701.393942109" lastFinishedPulling="2026-02-23 14:46:13.004144066 +0000 UTC m=+705.130459998" observedRunningTime="2026-02-23 14:46:13.816230021 +0000 UTC m=+705.942545953" watchObservedRunningTime="2026-02-23 14:46:13.830271584 +0000 UTC m=+705.956587516" Feb 23 14:46:14.669329 master-0 kubenswrapper[28758]: E0223 14:46:14.669243 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" podUID="9b189b3e-46a8-4ae5-8037-09238906f46c" Feb 23 14:46:14.669688 master-0 kubenswrapper[28758]: E0223 14:46:14.669456 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" 
podUID="6520dae0-e31c-40d1-a562-6e0631553146" Feb 23 14:46:14.669688 master-0 kubenswrapper[28758]: E0223 14:46:14.669548 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" podUID="f257fd05-c591-4324-94b4-8f87a7741118" Feb 23 14:46:14.669688 master-0 kubenswrapper[28758]: E0223 14:46:14.669630 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:c7c7d4228994efb8b93cfabe4d78b40b085d91848dc49db247b7bbca689dae06\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" podUID="0e1182a2-d37e-45d6-9457-af639fc42480" Feb 23 14:46:14.670863 master-0 kubenswrapper[28758]: E0223 14:46:14.670817 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:14ae1fb8d065e2317959ce7490a878dc87731d27ebf40259f801ba1a83cfefcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" podUID="63f4cc5c-7ea4-40dd-8ada-a8508d600f2a" Feb 23 14:46:16.175814 master-0 kubenswrapper[28758]: I0223 14:46:16.175555 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:16.176365 master-0 kubenswrapper[28758]: 
E0223 14:46:16.175858 28758 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 23 14:46:16.176365 master-0 kubenswrapper[28758]: E0223 14:46:16.175960 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert podName:89e18c7d-ffb4-44e5-b640-e26175c114e1 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:24.175933255 +0000 UTC m=+716.302249267 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert") pod "infra-operator-controller-manager-5f879c76b6-5zmbh" (UID: "89e18c7d-ffb4-44e5-b640-e26175c114e1") : secret "infra-operator-webhook-server-cert" not found Feb 23 14:46:16.582280 master-0 kubenswrapper[28758]: I0223 14:46:16.582188 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:16.582590 master-0 kubenswrapper[28758]: E0223 14:46:16.582537 28758 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:16.582665 master-0 kubenswrapper[28758]: E0223 14:46:16.582620 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert podName:b2991a48-1675-489b-b719-b366221c255f nodeName:}" failed. No retries permitted until 2026-02-23 14:46:24.582600949 +0000 UTC m=+716.708916881 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert") pod "openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" (UID: "b2991a48-1675-489b-b719-b366221c255f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 23 14:46:17.195559 master-0 kubenswrapper[28758]: I0223 14:46:17.195498 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:17.196209 master-0 kubenswrapper[28758]: I0223 14:46:17.196189 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:17.197884 master-0 kubenswrapper[28758]: E0223 14:46:17.197866 28758 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 14:46:17.198024 master-0 kubenswrapper[28758]: E0223 14:46:17.198012 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:25.197997724 +0000 UTC m=+717.324313656 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "webhook-server-cert" not found Feb 23 14:46:17.198682 master-0 kubenswrapper[28758]: E0223 14:46:17.198667 28758 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 23 14:46:17.198780 master-0 kubenswrapper[28758]: E0223 14:46:17.198769 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:25.198760334 +0000 UTC m=+717.325076266 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "metrics-server-cert" not found Feb 23 14:46:18.349169 master-0 kubenswrapper[28758]: I0223 14:46:18.349094 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-9trkk" Feb 23 14:46:18.389734 master-0 kubenswrapper[28758]: I0223 14:46:18.389638 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-zgz2h" Feb 23 14:46:18.549695 master-0 kubenswrapper[28758]: I0223 14:46:18.549624 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5kwfs" Feb 23 14:46:23.770721 master-0 kubenswrapper[28758]: I0223 14:46:23.770662 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" event={"ID":"5602bab5-e632-45c0-9a58-fca8c507ff8d","Type":"ContainerStarted","Data":"4e38c32dcfce3a034503edb616a96ffd662ac078da9cf6fa742329c258045154"} Feb 23 14:46:23.773572 master-0 kubenswrapper[28758]: I0223 14:46:23.773069 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" Feb 23 14:46:23.783700 master-0 kubenswrapper[28758]: I0223 14:46:23.783656 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6" event={"ID":"b00988a0-8239-4f73-832d-5e28c3afac6a","Type":"ContainerStarted","Data":"161616630b0b14cc0c2e0a903f84d73d425e964799674e07d161a4b30df92b29"} Feb 23 14:46:23.784342 master-0 kubenswrapper[28758]: I0223 14:46:23.784323 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6" Feb 23 14:46:23.800631 master-0 kubenswrapper[28758]: I0223 14:46:23.800603 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" event={"ID":"ada28ec1-a66b-4668-941c-c0f0cd424ee4","Type":"ContainerStarted","Data":"1b617c597f70b2a5dc4ef1a37715151276b8524a72a74c1cad07d6b2857f6cf1"} Feb 23 14:46:23.801259 master-0 kubenswrapper[28758]: I0223 14:46:23.801243 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" Feb 23 14:46:23.818603 master-0 kubenswrapper[28758]: I0223 14:46:23.816789 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8" event={"ID":"eb045dcf-8041-4baf-a750-6ad9e951dd65","Type":"ContainerStarted","Data":"0e10ab7aa752a97780951c68ff9b036cc3424a7f02690855ed960747d00f1c73"} Feb 
23 14:46:23.818603 master-0 kubenswrapper[28758]: I0223 14:46:23.817681 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8" Feb 23 14:46:23.834458 master-0 kubenswrapper[28758]: I0223 14:46:23.834387 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" event={"ID":"19dbaca0-2912-45da-bc06-8050bd931706","Type":"ContainerStarted","Data":"9352b0f2e9dd18cf1ddec62e4aebca83a6408b746523048b4ffdcc7332905c1d"} Feb 23 14:46:23.835337 master-0 kubenswrapper[28758]: I0223 14:46:23.835299 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" Feb 23 14:46:23.849574 master-0 kubenswrapper[28758]: I0223 14:46:23.847760 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl" event={"ID":"14ce0057-4678-4bf7-bf28-256945f8a589","Type":"ContainerStarted","Data":"c17a8c243dca9400954ad62091f6bc03d4977a6864895fde903ba75d72aa9817"} Feb 23 14:46:23.849574 master-0 kubenswrapper[28758]: I0223 14:46:23.848682 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl" Feb 23 14:46:23.860614 master-0 kubenswrapper[28758]: I0223 14:46:23.860544 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn" event={"ID":"8cc75447-5114-451b-a0c7-accd382d82eb","Type":"ContainerStarted","Data":"5e7f937155aa632327a536530b6ada413bdca7009236c0984334ee453342737e"} Feb 23 14:46:23.861553 master-0 kubenswrapper[28758]: I0223 14:46:23.861515 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn" Feb 23 
14:46:23.868510 master-0 kubenswrapper[28758]: I0223 14:46:23.862583 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" podStartSLOduration=7.301158232 podStartE2EDuration="16.862561212s" podCreationTimestamp="2026-02-23 14:46:07 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.895969074 +0000 UTC m=+705.022285006" lastFinishedPulling="2026-02-23 14:46:22.457372054 +0000 UTC m=+714.583687986" observedRunningTime="2026-02-23 14:46:23.85871374 +0000 UTC m=+715.985029672" watchObservedRunningTime="2026-02-23 14:46:23.862561212 +0000 UTC m=+715.988877144" Feb 23 14:46:23.868510 master-0 kubenswrapper[28758]: I0223 14:46:23.866369 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" event={"ID":"b496c821-47a3-4f07-838c-54560eebf847","Type":"ContainerStarted","Data":"dd890d8a63cc3873b9707118b0bad9084cb65ac3382985be8a0296777f83c43b"} Feb 23 14:46:23.868510 master-0 kubenswrapper[28758]: I0223 14:46:23.867284 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" Feb 23 14:46:23.877505 master-0 kubenswrapper[28758]: I0223 14:46:23.874838 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" event={"ID":"10b5a636-69f5-4828-8e1e-9a3a598c28aa","Type":"ContainerStarted","Data":"0e7e61e6af7d60b9404c46da6be1b08da9de2d230baa0297da4fda49a85d78f1"} Feb 23 14:46:23.877505 master-0 kubenswrapper[28758]: I0223 14:46:23.875812 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" Feb 23 14:46:23.881119 master-0 kubenswrapper[28758]: I0223 14:46:23.880641 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf" event={"ID":"61cee583-7aa7-483b-b0e4-96f48d26a940","Type":"ContainerStarted","Data":"6677cf8bd69b9e0e26a716ebb0ac73c844a22d0f4da02377babc016a673198e2"} Feb 23 14:46:23.882623 master-0 kubenswrapper[28758]: I0223 14:46:23.881527 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf" Feb 23 14:46:23.891588 master-0 kubenswrapper[28758]: I0223 14:46:23.891420 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" event={"ID":"13419178-05f6-4d41-be2b-2849b477ff68","Type":"ContainerStarted","Data":"4c3485e035051d3cb733f8533ac4f2d3f2b3d885ee89cf6a5eb8c1c07b855c6b"} Feb 23 14:46:23.896510 master-0 kubenswrapper[28758]: I0223 14:46:23.892541 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" Feb 23 14:46:23.900696 master-0 kubenswrapper[28758]: I0223 14:46:23.900639 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl" podStartSLOduration=5.816093047 podStartE2EDuration="15.900621362s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.270252886 +0000 UTC m=+704.396568818" lastFinishedPulling="2026-02-23 14:46:22.354781191 +0000 UTC m=+714.481097133" observedRunningTime="2026-02-23 14:46:23.896867713 +0000 UTC m=+716.023183645" watchObservedRunningTime="2026-02-23 14:46:23.900621362 +0000 UTC m=+716.026937294" Feb 23 14:46:23.905426 master-0 kubenswrapper[28758]: I0223 14:46:23.905378 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" 
event={"ID":"0114c0ec-3af5-4d4e-adac-34f5471c64ce","Type":"ContainerStarted","Data":"276904ebfe1589ea010cace46ad8d544d515c007e9d0663e9a41aabea993458f"} Feb 23 14:46:23.906154 master-0 kubenswrapper[28758]: I0223 14:46:23.906074 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" Feb 23 14:46:23.942845 master-0 kubenswrapper[28758]: I0223 14:46:23.942762 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6" podStartSLOduration=6.334699102 podStartE2EDuration="15.94274322s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.918972625 +0000 UTC m=+705.045288557" lastFinishedPulling="2026-02-23 14:46:22.527016733 +0000 UTC m=+714.653332675" observedRunningTime="2026-02-23 14:46:23.931494122 +0000 UTC m=+716.057810064" watchObservedRunningTime="2026-02-23 14:46:23.94274322 +0000 UTC m=+716.069059152" Feb 23 14:46:23.982558 master-0 kubenswrapper[28758]: I0223 14:46:23.980535 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8" podStartSLOduration=6.425973105 podStartE2EDuration="15.980513533s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.904548322 +0000 UTC m=+705.030864254" lastFinishedPulling="2026-02-23 14:46:22.45908875 +0000 UTC m=+714.585404682" observedRunningTime="2026-02-23 14:46:23.969730327 +0000 UTC m=+716.096046269" watchObservedRunningTime="2026-02-23 14:46:23.980513533 +0000 UTC m=+716.106829465" Feb 23 14:46:24.014241 master-0 kubenswrapper[28758]: I0223 14:46:24.014152 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" podStartSLOduration=5.936491822 
podStartE2EDuration="16.014127735s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.27568575 +0000 UTC m=+704.402001682" lastFinishedPulling="2026-02-23 14:46:22.353321643 +0000 UTC m=+714.479637595" observedRunningTime="2026-02-23 14:46:24.001931521 +0000 UTC m=+716.128247453" watchObservedRunningTime="2026-02-23 14:46:24.014127735 +0000 UTC m=+716.140443667" Feb 23 14:46:24.041499 master-0 kubenswrapper[28758]: I0223 14:46:24.041308 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" podStartSLOduration=6.599042019 podStartE2EDuration="16.041275276s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.90976346 +0000 UTC m=+705.036079392" lastFinishedPulling="2026-02-23 14:46:22.351996717 +0000 UTC m=+714.478312649" observedRunningTime="2026-02-23 14:46:24.030027587 +0000 UTC m=+716.156343519" watchObservedRunningTime="2026-02-23 14:46:24.041275276 +0000 UTC m=+716.167591208" Feb 23 14:46:24.061779 master-0 kubenswrapper[28758]: I0223 14:46:24.061683 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" podStartSLOduration=6.63075558 podStartE2EDuration="16.061658257s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.924050049 +0000 UTC m=+705.050365981" lastFinishedPulling="2026-02-23 14:46:22.354952736 +0000 UTC m=+714.481268658" observedRunningTime="2026-02-23 14:46:24.051646781 +0000 UTC m=+716.177962713" watchObservedRunningTime="2026-02-23 14:46:24.061658257 +0000 UTC m=+716.187974189" Feb 23 14:46:24.078521 master-0 kubenswrapper[28758]: I0223 14:46:24.078400 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn" 
podStartSLOduration=6.618725741 podStartE2EDuration="16.078379931s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.895967204 +0000 UTC m=+705.022283136" lastFinishedPulling="2026-02-23 14:46:22.355621404 +0000 UTC m=+714.481937326" observedRunningTime="2026-02-23 14:46:24.069855854 +0000 UTC m=+716.196171786" watchObservedRunningTime="2026-02-23 14:46:24.078379931 +0000 UTC m=+716.204695863" Feb 23 14:46:24.089649 master-0 kubenswrapper[28758]: I0223 14:46:24.089560 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" podStartSLOduration=6.597061726 podStartE2EDuration="16.089538087s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.915894853 +0000 UTC m=+705.042210795" lastFinishedPulling="2026-02-23 14:46:22.408371224 +0000 UTC m=+714.534687156" observedRunningTime="2026-02-23 14:46:24.086156437 +0000 UTC m=+716.212472369" watchObservedRunningTime="2026-02-23 14:46:24.089538087 +0000 UTC m=+716.215854039" Feb 23 14:46:24.143498 master-0 kubenswrapper[28758]: I0223 14:46:24.140167 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" podStartSLOduration=6.647748991 podStartE2EDuration="16.14013913s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.917035823 +0000 UTC m=+705.043351765" lastFinishedPulling="2026-02-23 14:46:22.409425972 +0000 UTC m=+714.535741904" observedRunningTime="2026-02-23 14:46:24.12433213 +0000 UTC m=+716.250648062" watchObservedRunningTime="2026-02-23 14:46:24.14013913 +0000 UTC m=+716.266455062" Feb 23 14:46:24.164498 master-0 kubenswrapper[28758]: I0223 14:46:24.163127 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" podStartSLOduration=6.609061155 podStartE2EDuration="16.1631112s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.906131354 +0000 UTC m=+705.032447276" lastFinishedPulling="2026-02-23 14:46:22.460181389 +0000 UTC m=+714.586497321" observedRunningTime="2026-02-23 14:46:24.161708112 +0000 UTC m=+716.288024044" watchObservedRunningTime="2026-02-23 14:46:24.1631112 +0000 UTC m=+716.289427132" Feb 23 14:46:24.183596 master-0 kubenswrapper[28758]: I0223 14:46:24.183444 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf" podStartSLOduration=6.736902338 podStartE2EDuration="16.183428529s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.90710637 +0000 UTC m=+705.033422302" lastFinishedPulling="2026-02-23 14:46:22.353632561 +0000 UTC m=+714.479948493" observedRunningTime="2026-02-23 14:46:24.182818983 +0000 UTC m=+716.309134915" watchObservedRunningTime="2026-02-23 14:46:24.183428529 +0000 UTC m=+716.309744461" Feb 23 14:46:24.250211 master-0 kubenswrapper[28758]: I0223 14:46:24.250140 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:24.253044 master-0 kubenswrapper[28758]: I0223 14:46:24.253000 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89e18c7d-ffb4-44e5-b640-e26175c114e1-cert\") pod \"infra-operator-controller-manager-5f879c76b6-5zmbh\" (UID: \"89e18c7d-ffb4-44e5-b640-e26175c114e1\") " 
pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:24.536074 master-0 kubenswrapper[28758]: I0223 14:46:24.536014 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:24.659950 master-0 kubenswrapper[28758]: I0223 14:46:24.659897 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:24.664784 master-0 kubenswrapper[28758]: I0223 14:46:24.664741 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b2991a48-1675-489b-b719-b366221c255f-cert\") pod \"openstack-baremetal-operator-controller-manager-579b7786b9qh5rv\" (UID: \"b2991a48-1675-489b-b719-b366221c255f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:24.833344 master-0 kubenswrapper[28758]: I0223 14:46:24.832912 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:24.982732 master-0 kubenswrapper[28758]: I0223 14:46:24.982696 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh"] Feb 23 14:46:25.270714 master-0 kubenswrapper[28758]: I0223 14:46:25.270623 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:25.270967 master-0 kubenswrapper[28758]: I0223 14:46:25.270789 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:25.273166 master-0 kubenswrapper[28758]: E0223 14:46:25.272289 28758 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 23 14:46:25.273166 master-0 kubenswrapper[28758]: E0223 14:46:25.272406 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs podName:98eda798-728f-481c-bb71-8a12ad8a3c37 nodeName:}" failed. No retries permitted until 2026-02-23 14:46:41.272379283 +0000 UTC m=+733.398695255 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs") pod "openstack-operator-controller-manager-5dc486cffc-dtpjj" (UID: "98eda798-728f-481c-bb71-8a12ad8a3c37") : secret "webhook-server-cert" not found Feb 23 14:46:25.276685 master-0 kubenswrapper[28758]: I0223 14:46:25.276635 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-metrics-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:25.284182 master-0 kubenswrapper[28758]: W0223 14:46:25.284122 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2991a48_1675_489b_b719_b366221c255f.slice/crio-b589d0a1ef29ae910d69a8ade89abef87f963a891a39e2681db92c32e711b748 WatchSource:0}: Error finding container b589d0a1ef29ae910d69a8ade89abef87f963a891a39e2681db92c32e711b748: Status 404 returned error can't find the container with id b589d0a1ef29ae910d69a8ade89abef87f963a891a39e2681db92c32e711b748 Feb 23 14:46:25.286656 master-0 kubenswrapper[28758]: I0223 14:46:25.286607 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv"] Feb 23 14:46:25.926103 master-0 kubenswrapper[28758]: I0223 14:46:25.926040 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" event={"ID":"b2991a48-1675-489b-b719-b366221c255f","Type":"ContainerStarted","Data":"b589d0a1ef29ae910d69a8ade89abef87f963a891a39e2681db92c32e711b748"} Feb 23 14:46:25.928754 master-0 kubenswrapper[28758]: I0223 14:46:25.928720 28758 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" event={"ID":"89e18c7d-ffb4-44e5-b640-e26175c114e1","Type":"ContainerStarted","Data":"70c39a69238c323dc11b66845d8fb7eb728504a7c2ebeea5ec88ee908a94a661"} Feb 23 14:46:27.968212 master-0 kubenswrapper[28758]: I0223 14:46:27.968119 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" event={"ID":"b2991a48-1675-489b-b719-b366221c255f","Type":"ContainerStarted","Data":"f1eb90dcfbf8e1516319e3c5cc1461eb460f479602cfe5633a4f648314a907bf"} Feb 23 14:46:27.968923 master-0 kubenswrapper[28758]: I0223 14:46:27.968231 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:27.970385 master-0 kubenswrapper[28758]: I0223 14:46:27.970332 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" event={"ID":"89e18c7d-ffb4-44e5-b640-e26175c114e1","Type":"ContainerStarted","Data":"575006bab924030a6c71ed089f62e86e98c74f2f2d5fe487826deda2d3e8e64a"} Feb 23 14:46:27.971352 master-0 kubenswrapper[28758]: I0223 14:46:27.971312 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:28.016821 master-0 kubenswrapper[28758]: I0223 14:46:28.016741 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" podStartSLOduration=17.875194404 podStartE2EDuration="20.016719626s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:25.286147899 +0000 UTC m=+717.412463821" lastFinishedPulling="2026-02-23 14:46:27.427673111 +0000 UTC m=+719.553989043" 
observedRunningTime="2026-02-23 14:46:28.008399206 +0000 UTC m=+720.134715138" watchObservedRunningTime="2026-02-23 14:46:28.016719626 +0000 UTC m=+720.143035548" Feb 23 14:46:28.037975 master-0 kubenswrapper[28758]: I0223 14:46:28.037862 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" podStartSLOduration=17.616078926 podStartE2EDuration="20.037841367s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:24.999683775 +0000 UTC m=+717.125999707" lastFinishedPulling="2026-02-23 14:46:27.421446216 +0000 UTC m=+719.547762148" observedRunningTime="2026-02-23 14:46:28.029099135 +0000 UTC m=+720.155415077" watchObservedRunningTime="2026-02-23 14:46:28.037841367 +0000 UTC m=+720.164157309" Feb 23 14:46:28.672210 master-0 kubenswrapper[28758]: I0223 14:46:28.672159 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" Feb 23 14:46:28.685961 master-0 kubenswrapper[28758]: I0223 14:46:28.685394 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-zftk6" Feb 23 14:46:28.729069 master-0 kubenswrapper[28758]: I0223 14:46:28.728856 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" Feb 23 14:46:28.739110 master-0 kubenswrapper[28758]: I0223 14:46:28.739031 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-bcvt8" Feb 23 14:46:31.743885 master-0 kubenswrapper[28758]: I0223 14:46:31.743826 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-gtxbn" Feb 23 
14:46:31.744420 master-0 kubenswrapper[28758]: I0223 14:46:31.743917 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-589c568786-44k55" Feb 23 14:46:31.744420 master-0 kubenswrapper[28758]: I0223 14:46:31.743966 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pxqtf" Feb 23 14:46:31.744420 master-0 kubenswrapper[28758]: I0223 14:46:31.744001 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" Feb 23 14:46:31.744420 master-0 kubenswrapper[28758]: I0223 14:46:31.744041 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-96nsl" Feb 23 14:46:31.744420 master-0 kubenswrapper[28758]: I0223 14:46:31.744078 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" Feb 23 14:46:31.744420 master-0 kubenswrapper[28758]: I0223 14:46:31.744116 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" Feb 23 14:46:31.744420 master-0 kubenswrapper[28758]: I0223 14:46:31.744153 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf" Feb 23 14:46:32.741416 master-0 kubenswrapper[28758]: I0223 14:46:32.741360 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5f879c76b6-5zmbh" Feb 23 14:46:34.839877 master-0 kubenswrapper[28758]: I0223 14:46:34.839834 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-579b7786b9qh5rv" Feb 23 14:46:37.800360 master-0 kubenswrapper[28758]: I0223 14:46:37.796719 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" event={"ID":"f257fd05-c591-4324-94b4-8f87a7741118","Type":"ContainerStarted","Data":"912c45ab6164d1a26a24057fe81a873dfab64f4b005520e3cf6c7c76e2747449"} Feb 23 14:46:37.800360 master-0 kubenswrapper[28758]: I0223 14:46:37.797028 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" Feb 23 14:46:37.806503 master-0 kubenswrapper[28758]: I0223 14:46:37.803243 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" event={"ID":"6520dae0-e31c-40d1-a562-6e0631553146","Type":"ContainerStarted","Data":"7913702166b9df5ff4de6bd3a471f4e3a658c1a17be9eb73ed22c2ebf477e2a7"} Feb 23 14:46:37.806503 master-0 kubenswrapper[28758]: I0223 14:46:37.803433 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" Feb 23 14:46:37.812153 master-0 kubenswrapper[28758]: I0223 14:46:37.811165 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" event={"ID":"0e1182a2-d37e-45d6-9457-af639fc42480","Type":"ContainerStarted","Data":"0990439c98d695d78dd8349c4f1515112d311a5a8a2ccf7325f9ae63d0a35011"} Feb 23 14:46:37.812474 master-0 kubenswrapper[28758]: I0223 14:46:37.812433 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" Feb 23 14:46:37.839821 master-0 kubenswrapper[28758]: I0223 14:46:37.837834 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" podStartSLOduration=5.663138957 podStartE2EDuration="29.837812109s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.931190659 +0000 UTC m=+705.057506591" lastFinishedPulling="2026-02-23 14:46:37.105863811 +0000 UTC m=+729.232179743" observedRunningTime="2026-02-23 14:46:37.824966918 +0000 UTC m=+729.951282860" watchObservedRunningTime="2026-02-23 14:46:37.837812109 +0000 UTC m=+729.964128041" Feb 23 14:46:37.848165 master-0 kubenswrapper[28758]: I0223 14:46:37.848079 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" podStartSLOduration=5.706883498 podStartE2EDuration="29.84805624s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.986321103 +0000 UTC m=+705.112637035" lastFinishedPulling="2026-02-23 14:46:37.127493845 +0000 UTC m=+729.253809777" observedRunningTime="2026-02-23 14:46:37.845535714 +0000 UTC m=+729.971851666" watchObservedRunningTime="2026-02-23 14:46:37.84805624 +0000 UTC m=+729.974372172" Feb 23 14:46:37.869579 master-0 kubenswrapper[28758]: I0223 14:46:37.869375 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" podStartSLOduration=6.796233623 podStartE2EDuration="29.869350286s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.98283098 +0000 UTC m=+705.109146912" lastFinishedPulling="2026-02-23 14:46:36.055947643 +0000 UTC m=+728.182263575" observedRunningTime="2026-02-23 14:46:37.868792721 +0000 UTC m=+729.995108673" watchObservedRunningTime="2026-02-23 14:46:37.869350286 +0000 UTC m=+729.995666218" Feb 23 14:46:40.846339 master-0 kubenswrapper[28758]: I0223 14:46:40.846269 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" event={"ID":"9b189b3e-46a8-4ae5-8037-09238906f46c","Type":"ContainerStarted","Data":"ec491cfaaa79f8c2371d200d33e1b2e914c8346ad8dec47e3fbca360969315c6"} Feb 23 14:46:40.849405 master-0 kubenswrapper[28758]: I0223 14:46:40.849320 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" event={"ID":"63f4cc5c-7ea4-40dd-8ada-a8508d600f2a","Type":"ContainerStarted","Data":"0a685554e9840e3a2112ca1a45ed765fc798ed4b80f395cdf1fadcb1c46cb236"} Feb 23 14:46:40.849618 master-0 kubenswrapper[28758]: I0223 14:46:40.849565 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" Feb 23 14:46:40.870084 master-0 kubenswrapper[28758]: I0223 14:46:40.869980 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mlb5b" podStartSLOduration=5.918203708 podStartE2EDuration="32.869957322s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.986075796 +0000 UTC m=+705.112391728" lastFinishedPulling="2026-02-23 14:46:39.93782941 +0000 UTC m=+732.064145342" observedRunningTime="2026-02-23 14:46:40.862161435 +0000 UTC m=+732.988477387" watchObservedRunningTime="2026-02-23 14:46:40.869957322 +0000 UTC m=+732.996273264" Feb 23 14:46:40.896557 master-0 kubenswrapper[28758]: I0223 14:46:40.892011 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" podStartSLOduration=5.96048683 podStartE2EDuration="32.891992917s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="2026-02-23 14:46:12.986533278 +0000 UTC m=+705.112849210" lastFinishedPulling="2026-02-23 14:46:39.918039365 +0000 UTC m=+732.044355297" 
observedRunningTime="2026-02-23 14:46:40.890953519 +0000 UTC m=+733.017269451" watchObservedRunningTime="2026-02-23 14:46:40.891992917 +0000 UTC m=+733.018308849" Feb 23 14:46:41.281604 master-0 kubenswrapper[28758]: I0223 14:46:41.278770 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:41.284351 master-0 kubenswrapper[28758]: I0223 14:46:41.284309 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98eda798-728f-481c-bb71-8a12ad8a3c37-webhook-certs\") pod \"openstack-operator-controller-manager-5dc486cffc-dtpjj\" (UID: \"98eda798-728f-481c-bb71-8a12ad8a3c37\") " pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:41.490517 master-0 kubenswrapper[28758]: I0223 14:46:41.489901 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:42.015692 master-0 kubenswrapper[28758]: I0223 14:46:42.015628 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj"] Feb 23 14:46:42.024367 master-0 kubenswrapper[28758]: W0223 14:46:42.024262 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98eda798_728f_481c_bb71_8a12ad8a3c37.slice/crio-2a57d443740ee0e1393e489189b6b779e8524e71c9a2f56b7d9972562628a40d WatchSource:0}: Error finding container 2a57d443740ee0e1393e489189b6b779e8524e71c9a2f56b7d9972562628a40d: Status 404 returned error can't find the container with id 2a57d443740ee0e1393e489189b6b779e8524e71c9a2f56b7d9972562628a40d Feb 23 14:46:42.875260 master-0 kubenswrapper[28758]: I0223 14:46:42.875199 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" event={"ID":"98eda798-728f-481c-bb71-8a12ad8a3c37","Type":"ContainerStarted","Data":"50abb7e534d2c52b02788cc6445d3da409fc7cb4fd48f269902d58937deefab7"} Feb 23 14:46:42.875260 master-0 kubenswrapper[28758]: I0223 14:46:42.875252 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" event={"ID":"98eda798-728f-481c-bb71-8a12ad8a3c37","Type":"ContainerStarted","Data":"2a57d443740ee0e1393e489189b6b779e8524e71c9a2f56b7d9972562628a40d"} Feb 23 14:46:42.875575 master-0 kubenswrapper[28758]: I0223 14:46:42.875392 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:46:42.920974 master-0 kubenswrapper[28758]: I0223 14:46:42.920827 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" podStartSLOduration=34.920781407 podStartE2EDuration="34.920781407s" podCreationTimestamp="2026-02-23 14:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:46:42.910409951 +0000 UTC m=+735.036725883" watchObservedRunningTime="2026-02-23 14:46:42.920781407 +0000 UTC m=+735.047097339" Feb 23 14:46:48.811973 master-0 kubenswrapper[28758]: I0223 14:46:48.811918 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-2lsqj" Feb 23 14:46:48.826360 master-0 kubenswrapper[28758]: I0223 14:46:48.826258 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" Feb 23 14:46:48.909679 master-0 kubenswrapper[28758]: I0223 14:46:48.909582 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-bsbx5" Feb 23 14:46:49.004040 master-0 kubenswrapper[28758]: I0223 14:46:49.003943 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-wll2v" Feb 23 14:46:51.496296 master-0 kubenswrapper[28758]: I0223 14:46:51.496246 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5dc486cffc-dtpjj" Feb 23 14:47:26.434444 master-0 kubenswrapper[28758]: E0223 14:47:26.434393 28758 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Feb 23 14:47:28.517684 master-0 kubenswrapper[28758]: I0223 14:47:28.517565 28758 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc7f9869-fvhpj"] Feb 23 14:47:28.525833 master-0 kubenswrapper[28758]: I0223 14:47:28.525739 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.532019 master-0 kubenswrapper[28758]: I0223 14:47:28.531099 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 23 14:47:28.532019 master-0 kubenswrapper[28758]: I0223 14:47:28.531459 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 23 14:47:28.539456 master-0 kubenswrapper[28758]: I0223 14:47:28.536591 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 23 14:47:28.539456 master-0 kubenswrapper[28758]: I0223 14:47:28.538984 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc7f9869-fvhpj"] Feb 23 14:47:28.607126 master-0 kubenswrapper[28758]: I0223 14:47:28.604629 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d4c486879-dcskk"] Feb 23 14:47:28.607126 master-0 kubenswrapper[28758]: I0223 14:47:28.606470 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.608951 master-0 kubenswrapper[28758]: I0223 14:47:28.608875 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 23 14:47:28.612017 master-0 kubenswrapper[28758]: I0223 14:47:28.610934 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdd52\" (UniqueName: \"kubernetes.io/projected/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-kube-api-access-bdd52\") pod \"dnsmasq-dns-bc7f9869-fvhpj\" (UID: \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.612017 master-0 kubenswrapper[28758]: I0223 14:47:28.611163 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-config\") pod \"dnsmasq-dns-bc7f9869-fvhpj\" (UID: \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.628014 master-0 kubenswrapper[28758]: I0223 14:47:28.623541 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4c486879-dcskk"] Feb 23 14:47:28.713180 master-0 kubenswrapper[28758]: I0223 14:47:28.713095 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdd52\" (UniqueName: \"kubernetes.io/projected/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-kube-api-access-bdd52\") pod \"dnsmasq-dns-bc7f9869-fvhpj\" (UID: \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.713384 master-0 kubenswrapper[28758]: I0223 14:47:28.713225 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-config\") pod \"dnsmasq-dns-bc7f9869-fvhpj\" (UID: 
\"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.713384 master-0 kubenswrapper[28758]: I0223 14:47:28.713272 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-dns-svc\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.713384 master-0 kubenswrapper[28758]: I0223 14:47:28.713301 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj2x4\" (UniqueName: \"kubernetes.io/projected/eaf6c126-dc0e-4840-9c9d-afef68564e47-kube-api-access-wj2x4\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.713593 master-0 kubenswrapper[28758]: I0223 14:47:28.713522 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-config\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.714499 master-0 kubenswrapper[28758]: I0223 14:47:28.714433 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-config\") pod \"dnsmasq-dns-bc7f9869-fvhpj\" (UID: \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.729581 master-0 kubenswrapper[28758]: I0223 14:47:28.729532 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdd52\" (UniqueName: \"kubernetes.io/projected/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-kube-api-access-bdd52\") pod 
\"dnsmasq-dns-bc7f9869-fvhpj\" (UID: \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.815430 master-0 kubenswrapper[28758]: I0223 14:47:28.815276 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-dns-svc\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.815430 master-0 kubenswrapper[28758]: I0223 14:47:28.815362 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj2x4\" (UniqueName: \"kubernetes.io/projected/eaf6c126-dc0e-4840-9c9d-afef68564e47-kube-api-access-wj2x4\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.815430 master-0 kubenswrapper[28758]: I0223 14:47:28.815416 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-config\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.816113 master-0 kubenswrapper[28758]: I0223 14:47:28.816082 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-dns-svc\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.816423 master-0 kubenswrapper[28758]: I0223 14:47:28.816379 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-config\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: 
\"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.834011 master-0 kubenswrapper[28758]: I0223 14:47:28.833966 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj2x4\" (UniqueName: \"kubernetes.io/projected/eaf6c126-dc0e-4840-9c9d-afef68564e47-kube-api-access-wj2x4\") pod \"dnsmasq-dns-7d4c486879-dcskk\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:28.882727 master-0 kubenswrapper[28758]: I0223 14:47:28.882642 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:28.958353 master-0 kubenswrapper[28758]: I0223 14:47:28.954064 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:29.434562 master-0 kubenswrapper[28758]: W0223 14:47:29.434433 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc78f8eb5_a4fd_48e0_9eca_e89d4896fc67.slice/crio-2fe52a6ced5b50c75a6cec9826c0d909bb5d0bf9bfa82c010ab5dbc4d86ddd65 WatchSource:0}: Error finding container 2fe52a6ced5b50c75a6cec9826c0d909bb5d0bf9bfa82c010ab5dbc4d86ddd65: Status 404 returned error can't find the container with id 2fe52a6ced5b50c75a6cec9826c0d909bb5d0bf9bfa82c010ab5dbc4d86ddd65 Feb 23 14:47:29.435852 master-0 kubenswrapper[28758]: I0223 14:47:29.435780 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc7f9869-fvhpj"] Feb 23 14:47:29.553363 master-0 kubenswrapper[28758]: W0223 14:47:29.553303 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaf6c126_dc0e_4840_9c9d_afef68564e47.slice/crio-d59195820f21aadfc5248596519caec5ee5406053afd57b72ad59862d7fb04d9 WatchSource:0}: Error finding container 
d59195820f21aadfc5248596519caec5ee5406053afd57b72ad59862d7fb04d9: Status 404 returned error can't find the container with id d59195820f21aadfc5248596519caec5ee5406053afd57b72ad59862d7fb04d9 Feb 23 14:47:29.553928 master-0 kubenswrapper[28758]: I0223 14:47:29.553891 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4c486879-dcskk"] Feb 23 14:47:30.364086 master-0 kubenswrapper[28758]: I0223 14:47:30.363957 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" event={"ID":"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67","Type":"ContainerStarted","Data":"2fe52a6ced5b50c75a6cec9826c0d909bb5d0bf9bfa82c010ab5dbc4d86ddd65"} Feb 23 14:47:30.366698 master-0 kubenswrapper[28758]: I0223 14:47:30.366631 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4c486879-dcskk" event={"ID":"eaf6c126-dc0e-4840-9c9d-afef68564e47","Type":"ContainerStarted","Data":"d59195820f21aadfc5248596519caec5ee5406053afd57b72ad59862d7fb04d9"} Feb 23 14:47:31.312507 master-0 kubenswrapper[28758]: I0223 14:47:31.312001 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4c486879-dcskk"] Feb 23 14:47:31.337500 master-0 kubenswrapper[28758]: I0223 14:47:31.331557 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6974cff98c-ph5jg"] Feb 23 14:47:31.344495 master-0 kubenswrapper[28758]: I0223 14:47:31.340315 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.381499 master-0 kubenswrapper[28758]: I0223 14:47:31.374890 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6974cff98c-ph5jg"] Feb 23 14:47:31.503500 master-0 kubenswrapper[28758]: I0223 14:47:31.502541 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-dns-svc\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.503500 master-0 kubenswrapper[28758]: I0223 14:47:31.502677 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhvwx\" (UniqueName: \"kubernetes.io/projected/ef57f6b9-940a-45fd-99a9-520e64c5be84-kube-api-access-fhvwx\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.503500 master-0 kubenswrapper[28758]: I0223 14:47:31.502757 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-config\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.610867 master-0 kubenswrapper[28758]: I0223 14:47:31.608294 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhvwx\" (UniqueName: \"kubernetes.io/projected/ef57f6b9-940a-45fd-99a9-520e64c5be84-kube-api-access-fhvwx\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.610867 master-0 kubenswrapper[28758]: I0223 14:47:31.608960 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-config\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.610867 master-0 kubenswrapper[28758]: I0223 14:47:31.609182 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-dns-svc\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.610867 master-0 kubenswrapper[28758]: I0223 14:47:31.610449 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-dns-svc\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.617630 master-0 kubenswrapper[28758]: I0223 14:47:31.616054 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-config\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.644708 master-0 kubenswrapper[28758]: I0223 14:47:31.644663 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhvwx\" (UniqueName: \"kubernetes.io/projected/ef57f6b9-940a-45fd-99a9-520e64c5be84-kube-api-access-fhvwx\") pod \"dnsmasq-dns-6974cff98c-ph5jg\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") " pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:31.657699 master-0 kubenswrapper[28758]: I0223 14:47:31.657639 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-bc7f9869-fvhpj"]
Feb 23 14:47:31.676301 master-0 kubenswrapper[28758]: I0223 14:47:31.676253 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg"
Feb 23 14:47:31.698781 master-0 kubenswrapper[28758]: I0223 14:47:31.698722 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c45d57b9c-qrtjc"]
Feb 23 14:47:31.702709 master-0 kubenswrapper[28758]: I0223 14:47:31.702670 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.766580 master-0 kubenswrapper[28758]: I0223 14:47:31.766189 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c45d57b9c-qrtjc"]
Feb 23 14:47:31.820649 master-0 kubenswrapper[28758]: I0223 14:47:31.820593 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-config\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.821412 master-0 kubenswrapper[28758]: I0223 14:47:31.820681 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-dns-svc\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.821412 master-0 kubenswrapper[28758]: I0223 14:47:31.820725 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6shm8\" (UniqueName: \"kubernetes.io/projected/f23ec168-6731-4fdf-8b85-837218a794f6-kube-api-access-6shm8\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.923990 master-0 kubenswrapper[28758]: I0223 14:47:31.923870 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-config\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.923990 master-0 kubenswrapper[28758]: I0223 14:47:31.923952 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-dns-svc\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.924381 master-0 kubenswrapper[28758]: I0223 14:47:31.924036 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6shm8\" (UniqueName: \"kubernetes.io/projected/f23ec168-6731-4fdf-8b85-837218a794f6-kube-api-access-6shm8\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.925005 master-0 kubenswrapper[28758]: I0223 14:47:31.924971 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-config\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.925221 master-0 kubenswrapper[28758]: I0223 14:47:31.925169 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-dns-svc\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:31.943408 master-0 kubenswrapper[28758]: I0223 14:47:31.943360 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6shm8\" (UniqueName: \"kubernetes.io/projected/f23ec168-6731-4fdf-8b85-837218a794f6-kube-api-access-6shm8\") pod \"dnsmasq-dns-7c45d57b9c-qrtjc\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") " pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:32.113978 master-0 kubenswrapper[28758]: I0223 14:47:32.113906 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:47:32.288080 master-0 kubenswrapper[28758]: I0223 14:47:32.285421 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6974cff98c-ph5jg"]
Feb 23 14:47:32.402170 master-0 kubenswrapper[28758]: I0223 14:47:32.402061 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" event={"ID":"ef57f6b9-940a-45fd-99a9-520e64c5be84","Type":"ContainerStarted","Data":"e0bf3f6cfa4e400857642830a865234e9c759589d651ea562f5cf6cc5f68fec5"}
Feb 23 14:47:32.659436 master-0 kubenswrapper[28758]: I0223 14:47:32.659316 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c45d57b9c-qrtjc"]
Feb 23 14:47:32.669382 master-0 kubenswrapper[28758]: W0223 14:47:32.669310 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf23ec168_6731_4fdf_8b85_837218a794f6.slice/crio-51e34f0a2697d5173501f9e1efcdff962de18b3fc33ea0f3dd16904d58721fe9 WatchSource:0}: Error finding container 51e34f0a2697d5173501f9e1efcdff962de18b3fc33ea0f3dd16904d58721fe9: Status 404 returned error can't find the container with id 51e34f0a2697d5173501f9e1efcdff962de18b3fc33ea0f3dd16904d58721fe9
Feb 23 14:47:33.436198 master-0 kubenswrapper[28758]: I0223 14:47:33.436112 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" event={"ID":"f23ec168-6731-4fdf-8b85-837218a794f6","Type":"ContainerStarted","Data":"51e34f0a2697d5173501f9e1efcdff962de18b3fc33ea0f3dd16904d58721fe9"}
Feb 23 14:47:35.473937 master-0 kubenswrapper[28758]: I0223 14:47:35.473875 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 14:47:35.477083 master-0 kubenswrapper[28758]: I0223 14:47:35.477033 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.479663 master-0 kubenswrapper[28758]: I0223 14:47:35.479617 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 23 14:47:35.481399 master-0 kubenswrapper[28758]: I0223 14:47:35.481211 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 23 14:47:35.481548 master-0 kubenswrapper[28758]: I0223 14:47:35.481506 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 23 14:47:35.481613 master-0 kubenswrapper[28758]: I0223 14:47:35.481559 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 23 14:47:35.483028 master-0 kubenswrapper[28758]: I0223 14:47:35.482189 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 23 14:47:35.487659 master-0 kubenswrapper[28758]: I0223 14:47:35.483786 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 23 14:47:35.519796 master-0 kubenswrapper[28758]: I0223 14:47:35.519721 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 23 14:47:35.576178 master-0 kubenswrapper[28758]: I0223 14:47:35.576083 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7pd9\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-kube-api-access-r7pd9\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576178 master-0 kubenswrapper[28758]: I0223 14:47:35.576178 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576431 master-0 kubenswrapper[28758]: I0223 14:47:35.576215 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576431 master-0 kubenswrapper[28758]: I0223 14:47:35.576279 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-config-data\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576431 master-0 kubenswrapper[28758]: I0223 14:47:35.576299 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576431 master-0 kubenswrapper[28758]: I0223 14:47:35.576338 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2e5ebe95-e4c9-492d-9d4e-b527c8590b01\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f42d5b6b-1b32-4d6e-9984-6a0ce465f688\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576431 master-0 kubenswrapper[28758]: I0223 14:47:35.576373 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576602 master-0 kubenswrapper[28758]: I0223 14:47:35.576470 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576602 master-0 kubenswrapper[28758]: I0223 14:47:35.576516 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576602 master-0 kubenswrapper[28758]: I0223 14:47:35.576540 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.576602 master-0 kubenswrapper[28758]: I0223 14:47:35.576566 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.676136 master-0 kubenswrapper[28758]: I0223 14:47:35.676038 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Feb 23 14:47:35.679256 master-0 kubenswrapper[28758]: I0223 14:47:35.679117 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-config-data\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679256 master-0 kubenswrapper[28758]: I0223 14:47:35.679164 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679270 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2e5ebe95-e4c9-492d-9d4e-b527c8590b01\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f42d5b6b-1b32-4d6e-9984-6a0ce465f688\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679385 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679491 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679522 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679546 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679570 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679598 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7pd9\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-kube-api-access-r7pd9\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679637 master-0 kubenswrapper[28758]: I0223 14:47:35.679629 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.679997 master-0 kubenswrapper[28758]: I0223 14:47:35.679657 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.680516 master-0 kubenswrapper[28758]: I0223 14:47:35.680214 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.680862 master-0 kubenswrapper[28758]: I0223 14:47:35.680731 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Feb 23 14:47:35.682741 master-0 kubenswrapper[28758]: I0223 14:47:35.682265 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-config-data\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.699576 master-0 kubenswrapper[28758]: I0223 14:47:35.699524 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.702893 master-0 kubenswrapper[28758]: I0223 14:47:35.702845 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.726796 master-0 kubenswrapper[28758]: I0223 14:47:35.707649 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Feb 23 14:47:35.726796 master-0 kubenswrapper[28758]: I0223 14:47:35.718192 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.726796 master-0 kubenswrapper[28758]: I0223 14:47:35.718345 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.726796 master-0 kubenswrapper[28758]: I0223 14:47:35.721662 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 23 14:47:35.726796 master-0 kubenswrapper[28758]: I0223 14:47:35.721701 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2e5ebe95-e4c9-492d-9d4e-b527c8590b01\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f42d5b6b-1b32-4d6e-9984-6a0ce465f688\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/79f1a3fae49bd8fd2f012774cb849951804ef9e9f4f76b48629923d745bf57d8/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.728585 master-0 kubenswrapper[28758]: I0223 14:47:35.728191 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.729237 master-0 kubenswrapper[28758]: I0223 14:47:35.729211 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Feb 23 14:47:35.730351 master-0 kubenswrapper[28758]: I0223 14:47:35.730005 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Feb 23 14:47:35.730351 master-0 kubenswrapper[28758]: I0223 14:47:35.730151 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Feb 23 14:47:35.734292 master-0 kubenswrapper[28758]: I0223 14:47:35.733729 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.734822 master-0 kubenswrapper[28758]: I0223 14:47:35.734752 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7pd9\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-kube-api-access-r7pd9\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.757161 master-0 kubenswrapper[28758]: I0223 14:47:35.756271 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/373bbd85-b2d4-40a4-afc1-3ecf50a666e7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0"
Feb 23 14:47:35.787768 master-0 kubenswrapper[28758]: I0223 14:47:35.787679 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29c2ac3e-73c3-4a07-a129-fcea6817fda3-kolla-config\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.788199 master-0 kubenswrapper[28758]: I0223 14:47:35.787802 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2xzg\" (UniqueName: \"kubernetes.io/projected/29c2ac3e-73c3-4a07-a129-fcea6817fda3-kube-api-access-n2xzg\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.788199 master-0 kubenswrapper[28758]: I0223 14:47:35.787843 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c2ac3e-73c3-4a07-a129-fcea6817fda3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.788199 master-0 kubenswrapper[28758]: I0223 14:47:35.787963 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c2ac3e-73c3-4a07-a129-fcea6817fda3-config-data\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.788199 master-0 kubenswrapper[28758]: I0223 14:47:35.788094 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c2ac3e-73c3-4a07-a129-fcea6817fda3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.890487 master-0 kubenswrapper[28758]: I0223 14:47:35.890289 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c2ac3e-73c3-4a07-a129-fcea6817fda3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.890487 master-0 kubenswrapper[28758]: I0223 14:47:35.890399 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29c2ac3e-73c3-4a07-a129-fcea6817fda3-kolla-config\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.890827 master-0 kubenswrapper[28758]: I0223 14:47:35.890422 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2xzg\" (UniqueName: \"kubernetes.io/projected/29c2ac3e-73c3-4a07-a129-fcea6817fda3-kube-api-access-n2xzg\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.890827 master-0 kubenswrapper[28758]: I0223 14:47:35.890726 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c2ac3e-73c3-4a07-a129-fcea6817fda3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.891005 master-0 kubenswrapper[28758]: I0223 14:47:35.890956 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c2ac3e-73c3-4a07-a129-fcea6817fda3-config-data\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.892500 master-0 kubenswrapper[28758]: I0223 14:47:35.892334 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29c2ac3e-73c3-4a07-a129-fcea6817fda3-config-data\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.892772 master-0 kubenswrapper[28758]: I0223 14:47:35.892743 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29c2ac3e-73c3-4a07-a129-fcea6817fda3-kolla-config\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.893643 master-0 kubenswrapper[28758]: I0223 14:47:35.893604 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 14:47:35.895368 master-0 kubenswrapper[28758]: I0223 14:47:35.895340 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:35.896857 master-0 kubenswrapper[28758]: I0223 14:47:35.896809 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29c2ac3e-73c3-4a07-a129-fcea6817fda3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.898067 master-0 kubenswrapper[28758]: I0223 14:47:35.898032 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 23 14:47:35.898208 master-0 kubenswrapper[28758]: I0223 14:47:35.898155 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 23 14:47:35.898345 master-0 kubenswrapper[28758]: I0223 14:47:35.898312 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/29c2ac3e-73c3-4a07-a129-fcea6817fda3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.899897 master-0 kubenswrapper[28758]: I0223 14:47:35.899871 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 23 14:47:35.900003 master-0 kubenswrapper[28758]: I0223 14:47:35.899956 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 23 14:47:35.901215 master-0 kubenswrapper[28758]: I0223 14:47:35.901193 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 23 14:47:35.901440 master-0 kubenswrapper[28758]: I0223 14:47:35.901407 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 23 14:47:35.917333 master-0 kubenswrapper[28758]: I0223 14:47:35.917271 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 23 14:47:35.930160 master-0 kubenswrapper[28758]: I0223 14:47:35.929929 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2xzg\" (UniqueName: \"kubernetes.io/projected/29c2ac3e-73c3-4a07-a129-fcea6817fda3-kube-api-access-n2xzg\") pod \"memcached-0\" (UID: \"29c2ac3e-73c3-4a07-a129-fcea6817fda3\") " pod="openstack/memcached-0"
Feb 23 14:47:35.994027 master-0 kubenswrapper[28758]: I0223 14:47:35.993862 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e18414da-932f-4a26-ab6a-af32aa83196b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:35.994027 master-0 kubenswrapper[28758]: I0223 14:47:35.993976 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qnlx\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-kube-api-access-8qnlx\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:35.994310 master-0 kubenswrapper[28758]: I0223 14:47:35.994150 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-94b77561-0e8c-4889-9c92-22b2c3c81de7\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1cf62405-902f-4427-8646-44d4d14dbdf5\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:35.994310 master-0 kubenswrapper[28758]: I0223 14:47:35.994169 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:35.994310 master-0 kubenswrapper[28758]: I0223 14:47:35.994188 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:35.994310 master-0 kubenswrapper[28758]: I0223 14:47:35.994298 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.002227 master-0 kubenswrapper[28758]: I0223 14:47:36.002137 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.002449 master-0 kubenswrapper[28758]: I0223 14:47:36.002230 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.002449 master-0 kubenswrapper[28758]: I0223 14:47:36.002372 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.002725 master-0 kubenswrapper[28758]: I0223 14:47:36.002711 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e18414da-932f-4a26-ab6a-af32aa83196b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.003010 master-0 kubenswrapper[28758]: I0223 14:47:36.002982 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.107893 master-0 kubenswrapper[28758]: I0223 14:47:36.107833 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.107926 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e18414da-932f-4a26-ab6a-af32aa83196b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.107963 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108004 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e18414da-932f-4a26-ab6a-af32aa83196b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108026 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qnlx\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-kube-api-access-8qnlx\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108060 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-94b77561-0e8c-4889-9c92-22b2c3c81de7\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1cf62405-902f-4427-8646-44d4d14dbdf5\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108088 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108113 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108164 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108207 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.108260 master-0 kubenswrapper[28758]: I0223 14:47:36.108224 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.110272 master-0 kubenswrapper[28758]: I0223 14:47:36.110211 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:36.110922 master-0 kubenswrapper[28758]: I0223 14:47:36.110397 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.110922 master-0 kubenswrapper[28758]: I0223 14:47:36.110817 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 14:47:36.110922 master-0 kubenswrapper[28758]: I0223 14:47:36.110844 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-94b77561-0e8c-4889-9c92-22b2c3c81de7\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1cf62405-902f-4427-8646-44d4d14dbdf5\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/7a9b22fd2543dd6fc23a7b3fedc2e65ce0c57d18a8877d5b8476c4c71cfe6ccd/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.112074 master-0 kubenswrapper[28758]: I0223 14:47:36.111719 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e18414da-932f-4a26-ab6a-af32aa83196b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.112074 master-0 kubenswrapper[28758]: I0223 14:47:36.111778 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.123330 master-0 kubenswrapper[28758]: I0223 14:47:36.114856 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.123330 master-0 kubenswrapper[28758]: I0223 14:47:36.115580 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e18414da-932f-4a26-ab6a-af32aa83196b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.123330 master-0 kubenswrapper[28758]: I0223 14:47:36.120246 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.123330 master-0 kubenswrapper[28758]: I0223 14:47:36.121163 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e18414da-932f-4a26-ab6a-af32aa83196b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.128902 master-0 kubenswrapper[28758]: I0223 14:47:36.128044 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qnlx\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-kube-api-access-8qnlx\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.151712 master-0 kubenswrapper[28758]: I0223 14:47:36.139526 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 23 14:47:36.151712 master-0 kubenswrapper[28758]: I0223 14:47:36.143427 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e18414da-932f-4a26-ab6a-af32aa83196b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:47:36.951615 master-0 kubenswrapper[28758]: I0223 14:47:36.944298 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 23 14:47:36.951615 master-0 kubenswrapper[28758]: I0223 14:47:36.945882 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 23 14:47:36.952861 master-0 kubenswrapper[28758]: I0223 14:47:36.952832 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 23 14:47:36.953244 master-0 kubenswrapper[28758]: I0223 14:47:36.953230 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 23 14:47:36.953431 master-0 kubenswrapper[28758]: I0223 14:47:36.953419 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 23 14:47:37.022705 master-0 kubenswrapper[28758]: I0223 14:47:37.022606 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 14:47:37.039558 master-0 kubenswrapper[28758]: I0223 14:47:37.028641 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs7p6\" (UniqueName: \"kubernetes.io/projected/daacc97c-efdc-40e3-b833-237dde2caafe-kube-api-access-vs7p6\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.039558 master-0 kubenswrapper[28758]: I0223 14:47:37.028777 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-config-data-default\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.039558 master-0 kubenswrapper[28758]: I0223 14:47:37.028814 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daacc97c-efdc-40e3-b833-237dde2caafe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.039558 master-0 kubenswrapper[28758]: I0223 14:47:37.028877 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/daacc97c-efdc-40e3-b833-237dde2caafe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.039558 master-0 kubenswrapper[28758]: I0223 14:47:37.029003 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/daacc97c-efdc-40e3-b833-237dde2caafe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.039558 master-0 kubenswrapper[28758]: I0223 14:47:37.029046 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.039558 master-0 
kubenswrapper[28758]: I0223 14:47:37.029166 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7809075e-f1af-4286-b3df-18834662b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2f5a4f4-e43d-424e-94ed-0686821d1492\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.039558 master-0 kubenswrapper[28758]: I0223 14:47:37.029303 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-kolla-config\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132216 master-0 kubenswrapper[28758]: I0223 14:47:37.132149 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs7p6\" (UniqueName: \"kubernetes.io/projected/daacc97c-efdc-40e3-b833-237dde2caafe-kube-api-access-vs7p6\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132216 master-0 kubenswrapper[28758]: I0223 14:47:37.132221 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-config-data-default\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132452 master-0 kubenswrapper[28758]: I0223 14:47:37.132245 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daacc97c-efdc-40e3-b833-237dde2caafe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132452 master-0 
kubenswrapper[28758]: I0223 14:47:37.132266 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/daacc97c-efdc-40e3-b833-237dde2caafe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132452 master-0 kubenswrapper[28758]: I0223 14:47:37.132313 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/daacc97c-efdc-40e3-b833-237dde2caafe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132452 master-0 kubenswrapper[28758]: I0223 14:47:37.132332 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132452 master-0 kubenswrapper[28758]: I0223 14:47:37.132371 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7809075e-f1af-4286-b3df-18834662b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2f5a4f4-e43d-424e-94ed-0686821d1492\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.132452 master-0 kubenswrapper[28758]: I0223 14:47:37.132420 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-kolla-config\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.133287 master-0 kubenswrapper[28758]: I0223 14:47:37.133231 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/daacc97c-efdc-40e3-b833-237dde2caafe-config-data-generated\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.133351 master-0 kubenswrapper[28758]: I0223 14:47:37.133315 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-kolla-config\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.137625 master-0 kubenswrapper[28758]: I0223 14:47:37.137576 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-config-data-default\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.139165 master-0 kubenswrapper[28758]: I0223 14:47:37.139042 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daacc97c-efdc-40e3-b833-237dde2caafe-operator-scripts\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.139754 master-0 kubenswrapper[28758]: I0223 14:47:37.139680 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 14:47:37.139754 master-0 kubenswrapper[28758]: I0223 14:47:37.139717 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7809075e-f1af-4286-b3df-18834662b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2f5a4f4-e43d-424e-94ed-0686821d1492\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a65319228cb96557df527b03f8994c071f338ef5553855c5ac49b0951408464a/globalmount\"" pod="openstack/openstack-galera-0" Feb 23 14:47:37.158021 master-0 kubenswrapper[28758]: I0223 14:47:37.157555 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/daacc97c-efdc-40e3-b833-237dde2caafe-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.162923 master-0 kubenswrapper[28758]: I0223 14:47:37.162875 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daacc97c-efdc-40e3-b833-237dde2caafe-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.175576 master-0 kubenswrapper[28758]: I0223 14:47:37.175525 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs7p6\" (UniqueName: \"kubernetes.io/projected/daacc97c-efdc-40e3-b833-237dde2caafe-kube-api-access-vs7p6\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0" Feb 23 14:47:37.460843 master-0 kubenswrapper[28758]: I0223 14:47:37.460787 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2e5ebe95-e4c9-492d-9d4e-b527c8590b01\" (UniqueName: \"kubernetes.io/csi/topolvm.io^f42d5b6b-1b32-4d6e-9984-6a0ce465f688\") pod \"rabbitmq-server-0\" 
(UID: \"373bbd85-b2d4-40a4-afc1-3ecf50a666e7\") " pod="openstack/rabbitmq-server-0" Feb 23 14:47:37.613097 master-0 kubenswrapper[28758]: I0223 14:47:37.612989 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 23 14:47:38.515938 master-0 kubenswrapper[28758]: I0223 14:47:38.504689 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 14:47:38.515938 master-0 kubenswrapper[28758]: I0223 14:47:38.506339 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.536322 master-0 kubenswrapper[28758]: I0223 14:47:38.535007 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 23 14:47:38.536322 master-0 kubenswrapper[28758]: I0223 14:47:38.535763 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 23 14:47:38.536322 master-0 kubenswrapper[28758]: I0223 14:47:38.535932 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 23 14:47:38.574311 master-0 kubenswrapper[28758]: I0223 14:47:38.574245 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 14:47:38.581820 master-0 kubenswrapper[28758]: I0223 14:47:38.581778 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.582048 master-0 kubenswrapper[28758]: I0223 14:47:38.582028 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.582206 master-0 kubenswrapper[28758]: I0223 14:47:38.582183 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qz4h\" (UniqueName: \"kubernetes.io/projected/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-kube-api-access-6qz4h\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.582356 master-0 kubenswrapper[28758]: I0223 14:47:38.582337 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.582692 master-0 kubenswrapper[28758]: I0223 14:47:38.582638 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.582780 master-0 kubenswrapper[28758]: I0223 14:47:38.582716 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e31fafda-0bc9-4eb1-9e1b-43c1e25e0369\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d058654e-5dde-40ee-8337-029cdfcebc8a\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.582819 master-0 kubenswrapper[28758]: I0223 14:47:38.582782 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.582855 master-0 kubenswrapper[28758]: I0223 14:47:38.582817 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.691500 master-0 kubenswrapper[28758]: I0223 14:47:38.686343 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qz4h\" (UniqueName: \"kubernetes.io/projected/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-kube-api-access-6qz4h\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.693750 master-0 kubenswrapper[28758]: I0223 14:47:38.693706 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.702707 master-0 kubenswrapper[28758]: I0223 14:47:38.694617 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.703164 master-0 kubenswrapper[28758]: 
I0223 14:47:38.703137 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.703276 master-0 kubenswrapper[28758]: I0223 14:47:38.703259 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e31fafda-0bc9-4eb1-9e1b-43c1e25e0369\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d058654e-5dde-40ee-8337-029cdfcebc8a\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.703385 master-0 kubenswrapper[28758]: I0223 14:47:38.703372 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.703492 master-0 kubenswrapper[28758]: I0223 14:47:38.703459 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.703692 master-0 kubenswrapper[28758]: I0223 14:47:38.703678 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.703783 
master-0 kubenswrapper[28758]: I0223 14:47:38.703771 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.722499 master-0 kubenswrapper[28758]: I0223 14:47:38.704994 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.722499 master-0 kubenswrapper[28758]: I0223 14:47:38.706227 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.723550 master-0 kubenswrapper[28758]: I0223 14:47:38.723114 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0" Feb 23 14:47:38.731907 master-0 kubenswrapper[28758]: I0223 14:47:38.731838 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 14:47:38.731907 master-0 kubenswrapper[28758]: I0223 14:47:38.731888 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e31fafda-0bc9-4eb1-9e1b-43c1e25e0369\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d058654e-5dde-40ee-8337-029cdfcebc8a\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/81bcbdddb8d056b14361ae2a896d4f9fb06bde1dc0bb49ac8f01850c8c690f97/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Feb 23 14:47:38.733186 master-0 kubenswrapper[28758]: I0223 14:47:38.733136 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 14:47:38.746596 master-0 kubenswrapper[28758]: I0223 14:47:38.735712 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 14:47:38.750496 master-0 kubenswrapper[28758]: I0223 14:47:38.748218 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qz4h\" (UniqueName: \"kubernetes.io/projected/f7a29d97-0b1c-4657-ae05-8ef48a3813ba-kube-api-access-6qz4h\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 14:47:38.838087 master-0 kubenswrapper[28758]: I0223 14:47:38.837968 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-94b77561-0e8c-4889-9c92-22b2c3c81de7\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1cf62405-902f-4427-8646-44d4d14dbdf5\") pod \"rabbitmq-cell1-server-0\" (UID: \"e18414da-932f-4a26-ab6a-af32aa83196b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:39.024245 master-0 kubenswrapper[28758]: I0223 14:47:39.024156 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:47:39.849913 master-0 kubenswrapper[28758]: I0223 14:47:39.849784 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7809075e-f1af-4286-b3df-18834662b921\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d2f5a4f4-e43d-424e-94ed-0686821d1492\") pod \"openstack-galera-0\" (UID: \"daacc97c-efdc-40e3-b833-237dde2caafe\") " pod="openstack/openstack-galera-0"
Feb 23 14:47:40.023652 master-0 kubenswrapper[28758]: I0223 14:47:40.023539 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Feb 23 14:47:40.983702 master-0 kubenswrapper[28758]: I0223 14:47:40.983633 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e31fafda-0bc9-4eb1-9e1b-43c1e25e0369\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d058654e-5dde-40ee-8337-029cdfcebc8a\") pod \"openstack-cell1-galera-0\" (UID: \"f7a29d97-0b1c-4657-ae05-8ef48a3813ba\") " pod="openstack/openstack-cell1-galera-0"
Feb 23 14:47:41.264746 master-0 kubenswrapper[28758]: I0223 14:47:41.264645 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 23 14:47:41.566914 master-0 kubenswrapper[28758]: I0223 14:47:41.566725 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lssgn"]
Feb 23 14:47:41.568322 master-0 kubenswrapper[28758]: I0223 14:47:41.568267 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.571630 master-0 kubenswrapper[28758]: I0223 14:47:41.571587 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 23 14:47:41.571730 master-0 kubenswrapper[28758]: I0223 14:47:41.571587 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 23 14:47:41.575504 master-0 kubenswrapper[28758]: I0223 14:47:41.575426 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lssgn"]
Feb 23 14:47:41.635666 master-0 kubenswrapper[28758]: I0223 14:47:41.635591 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-td6ds"]
Feb 23 14:47:41.637921 master-0 kubenswrapper[28758]: I0223 14:47:41.637890 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.685185 master-0 kubenswrapper[28758]: I0223 14:47:41.685107 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9342b63-0ec2-4c10-898a-cebd5e86414a-scripts\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.685303 master-0 kubenswrapper[28758]: I0223 14:47:41.685273 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdfwp\" (UniqueName: \"kubernetes.io/projected/b9342b63-0ec2-4c10-898a-cebd5e86414a-kube-api-access-vdfwp\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.685364 master-0 kubenswrapper[28758]: I0223 14:47:41.685344 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9342b63-0ec2-4c10-898a-cebd5e86414a-combined-ca-bundle\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.685436 master-0 kubenswrapper[28758]: I0223 14:47:41.685414 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-run-ovn\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.685679 master-0 kubenswrapper[28758]: I0223 14:47:41.685653 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-run\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.685778 master-0 kubenswrapper[28758]: I0223 14:47:41.685758 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-log-ovn\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.685865 master-0 kubenswrapper[28758]: I0223 14:47:41.685844 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9342b63-0ec2-4c10-898a-cebd5e86414a-ovn-controller-tls-certs\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.693470 master-0 kubenswrapper[28758]: I0223 14:47:41.693370 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-td6ds"]
Feb 23 14:47:41.788811 master-0 kubenswrapper[28758]: I0223 14:47:41.788715 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9342b63-0ec2-4c10-898a-cebd5e86414a-ovn-controller-tls-certs\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.788811 master-0 kubenswrapper[28758]: I0223 14:47:41.788803 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-run\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.789136 master-0 kubenswrapper[28758]: I0223 14:47:41.788864 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9342b63-0ec2-4c10-898a-cebd5e86414a-scripts\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.789136 master-0 kubenswrapper[28758]: I0223 14:47:41.788921 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-lib\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.789136 master-0 kubenswrapper[28758]: I0223 14:47:41.788939 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-782t8\" (UniqueName: \"kubernetes.io/projected/f847003c-7775-4189-896d-b6c727a97222-kube-api-access-782t8\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.789136 master-0 kubenswrapper[28758]: I0223 14:47:41.788999 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdfwp\" (UniqueName: \"kubernetes.io/projected/b9342b63-0ec2-4c10-898a-cebd5e86414a-kube-api-access-vdfwp\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.789316 master-0 kubenswrapper[28758]: I0223 14:47:41.789159 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9342b63-0ec2-4c10-898a-cebd5e86414a-combined-ca-bundle\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.789538 master-0 kubenswrapper[28758]: I0223 14:47:41.789279 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-run-ovn\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.789614 master-0 kubenswrapper[28758]: I0223 14:47:41.789549 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-run\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.789748 master-0 kubenswrapper[28758]: I0223 14:47:41.789709 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-log\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.789839 master-0 kubenswrapper[28758]: I0223 14:47:41.789811 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-log-ovn\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.789952 master-0 kubenswrapper[28758]: I0223 14:47:41.789894 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-run-ovn\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.789952 master-0 kubenswrapper[28758]: I0223 14:47:41.789908 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f847003c-7775-4189-896d-b6c727a97222-scripts\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.790028 master-0 kubenswrapper[28758]: I0223 14:47:41.790005 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-etc-ovs\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.790028 master-0 kubenswrapper[28758]: I0223 14:47:41.790003 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-run\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.790247 master-0 kubenswrapper[28758]: I0223 14:47:41.790197 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b9342b63-0ec2-4c10-898a-cebd5e86414a-var-log-ovn\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.791537 master-0 kubenswrapper[28758]: I0223 14:47:41.791459 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9342b63-0ec2-4c10-898a-cebd5e86414a-scripts\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.793032 master-0 kubenswrapper[28758]: I0223 14:47:41.792800 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9342b63-0ec2-4c10-898a-cebd5e86414a-ovn-controller-tls-certs\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.793118 master-0 kubenswrapper[28758]: I0223 14:47:41.793071 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9342b63-0ec2-4c10-898a-cebd5e86414a-combined-ca-bundle\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.805299 master-0 kubenswrapper[28758]: I0223 14:47:41.805229 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdfwp\" (UniqueName: \"kubernetes.io/projected/b9342b63-0ec2-4c10-898a-cebd5e86414a-kube-api-access-vdfwp\") pod \"ovn-controller-lssgn\" (UID: \"b9342b63-0ec2-4c10-898a-cebd5e86414a\") " pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.891432 master-0 kubenswrapper[28758]: I0223 14:47:41.891294 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-run\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.891432 master-0 kubenswrapper[28758]: I0223 14:47:41.891384 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-lib\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.891432 master-0 kubenswrapper[28758]: I0223 14:47:41.891401 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-782t8\" (UniqueName: \"kubernetes.io/projected/f847003c-7775-4189-896d-b6c727a97222-kube-api-access-782t8\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.891748 master-0 kubenswrapper[28758]: I0223 14:47:41.891461 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-log\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.891748 master-0 kubenswrapper[28758]: I0223 14:47:41.891538 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f847003c-7775-4189-896d-b6c727a97222-scripts\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.891748 master-0 kubenswrapper[28758]: I0223 14:47:41.891565 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-etc-ovs\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.891901 master-0 kubenswrapper[28758]: I0223 14:47:41.891876 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-etc-ovs\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.891951 master-0 kubenswrapper[28758]: I0223 14:47:41.891941 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-run\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.892086 master-0 kubenswrapper[28758]: I0223 14:47:41.892052 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-lib\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.892465 master-0 kubenswrapper[28758]: I0223 14:47:41.892437 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f847003c-7775-4189-896d-b6c727a97222-var-log\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.894695 master-0 kubenswrapper[28758]: I0223 14:47:41.894648 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f847003c-7775-4189-896d-b6c727a97222-scripts\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.903929 master-0 kubenswrapper[28758]: I0223 14:47:41.903879 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:41.925728 master-0 kubenswrapper[28758]: I0223 14:47:41.925684 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-782t8\" (UniqueName: \"kubernetes.io/projected/f847003c-7775-4189-896d-b6c727a97222-kube-api-access-782t8\") pod \"ovn-controller-ovs-td6ds\" (UID: \"f847003c-7775-4189-896d-b6c727a97222\") " pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:41.989320 master-0 kubenswrapper[28758]: I0223 14:47:41.989235 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:47:42.393813 master-0 kubenswrapper[28758]: I0223 14:47:42.393740 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 23 14:47:42.395791 master-0 kubenswrapper[28758]: I0223 14:47:42.395738 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.398629 master-0 kubenswrapper[28758]: I0223 14:47:42.398598 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 23 14:47:42.399586 master-0 kubenswrapper[28758]: I0223 14:47:42.399558 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 23 14:47:42.399788 master-0 kubenswrapper[28758]: I0223 14:47:42.399745 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 23 14:47:42.399788 master-0 kubenswrapper[28758]: I0223 14:47:42.399763 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 23 14:47:42.400581 master-0 kubenswrapper[28758]: I0223 14:47:42.400534 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 23 14:47:42.500966 master-0 kubenswrapper[28758]: I0223 14:47:42.500902 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl8bt\" (UniqueName: \"kubernetes.io/projected/714c1426-c191-4c69-8346-acf55252ed0a-kube-api-access-wl8bt\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.500966 master-0 kubenswrapper[28758]: I0223 14:47:42.500968 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.501256 master-0 kubenswrapper[28758]: I0223 14:47:42.501091 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/714c1426-c191-4c69-8346-acf55252ed0a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.501256 master-0 kubenswrapper[28758]: I0223 14:47:42.501231 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/714c1426-c191-4c69-8346-acf55252ed0a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.501570 master-0 kubenswrapper[28758]: I0223 14:47:42.501495 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/714c1426-c191-4c69-8346-acf55252ed0a-config\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.501773 master-0 kubenswrapper[28758]: I0223 14:47:42.501747 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.501847 master-0 kubenswrapper[28758]: I0223 14:47:42.501825 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.501929 master-0 kubenswrapper[28758]: I0223 14:47:42.501904 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bf2bdeaf-e510-4061-a4df-eb17ce8c7de3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d8b5bce0-ae2b-4c74-9049-42fcf0ed7ca3\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.603945 master-0 kubenswrapper[28758]: I0223 14:47:42.603889 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.604178 master-0 kubenswrapper[28758]: I0223 14:47:42.604049 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.604178 master-0 kubenswrapper[28758]: I0223 14:47:42.604085 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bf2bdeaf-e510-4061-a4df-eb17ce8c7de3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d8b5bce0-ae2b-4c74-9049-42fcf0ed7ca3\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.604178 master-0 kubenswrapper[28758]: I0223 14:47:42.604146 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl8bt\" (UniqueName: \"kubernetes.io/projected/714c1426-c191-4c69-8346-acf55252ed0a-kube-api-access-wl8bt\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.604935 master-0 kubenswrapper[28758]: I0223 14:47:42.604714 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.605023 master-0 kubenswrapper[28758]: I0223 14:47:42.604953 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/714c1426-c191-4c69-8346-acf55252ed0a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.605023 master-0 kubenswrapper[28758]: I0223 14:47:42.605007 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/714c1426-c191-4c69-8346-acf55252ed0a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.605180 master-0 kubenswrapper[28758]: I0223 14:47:42.605077 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/714c1426-c191-4c69-8346-acf55252ed0a-config\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.607864 master-0 kubenswrapper[28758]: I0223 14:47:42.607786 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/714c1426-c191-4c69-8346-acf55252ed0a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.608393 master-0 kubenswrapper[28758]: I0223 14:47:42.608349 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/714c1426-c191-4c69-8346-acf55252ed0a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.608393 master-0 kubenswrapper[28758]: I0223 14:47:42.608383 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 23 14:47:42.608535 master-0 kubenswrapper[28758]: I0223 14:47:42.608411 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bf2bdeaf-e510-4061-a4df-eb17ce8c7de3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d8b5bce0-ae2b-4c74-9049-42fcf0ed7ca3\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/20441a7f6e1ee5d195a8284d30dabbb3acf7cca390ca4f191c4eeefd8f6504d4/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.608535 master-0 kubenswrapper[28758]: I0223 14:47:42.608411 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/714c1426-c191-4c69-8346-acf55252ed0a-config\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.609900 master-0 kubenswrapper[28758]: I0223 14:47:42.609848 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.612093 master-0 kubenswrapper[28758]: I0223 14:47:42.612010 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.612771 master-0 kubenswrapper[28758]: I0223 14:47:42.612734 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/714c1426-c191-4c69-8346-acf55252ed0a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:42.622531 master-0 kubenswrapper[28758]: I0223 14:47:42.622472 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl8bt\" (UniqueName: \"kubernetes.io/projected/714c1426-c191-4c69-8346-acf55252ed0a-kube-api-access-wl8bt\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:44.084508 master-0 kubenswrapper[28758]: I0223 14:47:44.068532 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bf2bdeaf-e510-4061-a4df-eb17ce8c7de3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d8b5bce0-ae2b-4c74-9049-42fcf0ed7ca3\") pod \"ovsdbserver-nb-0\" (UID: \"714c1426-c191-4c69-8346-acf55252ed0a\") " pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:44.227702 master-0 kubenswrapper[28758]: I0223 14:47:44.227491 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 23 14:47:46.836309 master-0 kubenswrapper[28758]: I0223 14:47:46.836203 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 23 14:47:46.838600 master-0 kubenswrapper[28758]: I0223 14:47:46.838556 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:46.840665 master-0 kubenswrapper[28758]: I0223 14:47:46.840385 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 23 14:47:46.840665 master-0 kubenswrapper[28758]: I0223 14:47:46.840657 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 23 14:47:46.840989 master-0 kubenswrapper[28758]: I0223 14:47:46.840963 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 23 14:47:47.162841 master-0 kubenswrapper[28758]: I0223 14:47:47.162520 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 23 14:47:47.223853 master-0 kubenswrapper[28758]: I0223 14:47:47.223773 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.224205 master-0 kubenswrapper[28758]: I0223 14:47:47.223902 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.224205 master-0 kubenswrapper[28758]: I0223 14:47:47.223976 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-config\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.224205 master-0 kubenswrapper[28758]: I0223 14:47:47.224067 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-768b860f-83e1-40ba-87aa-048b037fc1a8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^57769bc6-2dcd-45a6-8a58-22ffb7836429\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.224205 master-0 kubenswrapper[28758]: I0223 14:47:47.224144 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.224205 master-0 kubenswrapper[28758]: I0223 14:47:47.224196 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hdvk\" (UniqueName: \"kubernetes.io/projected/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-kube-api-access-2hdvk\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.224385 master-0 kubenswrapper[28758]: I0223 14:47:47.224242 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.224385 master-0 kubenswrapper[28758]: I0223 14:47:47.224291 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.325610 master-0 kubenswrapper[28758]: I0223 14:47:47.325507 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.325610 master-0 kubenswrapper[28758]: I0223 14:47:47.325623 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-config\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.325980 master-0 kubenswrapper[28758]: I0223 14:47:47.325686 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-768b860f-83e1-40ba-87aa-048b037fc1a8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^57769bc6-2dcd-45a6-8a58-22ffb7836429\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.326192 master-0 kubenswrapper[28758]: I0223 14:47:47.326124 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.326350 master-0 kubenswrapper[28758]: I0223 14:47:47.326309 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hdvk\" (UniqueName: \"kubernetes.io/projected/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-kube-api-access-2hdvk\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.326465 master-0 kubenswrapper[28758]: I0223 14:47:47.326437 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.326609 master-0 kubenswrapper[28758]: I0223 14:47:47.326558 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.327062 master-0 kubenswrapper[28758]: I0223 14:47:47.326713 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.327062 master-0 kubenswrapper[28758]: I0223 14:47:47.326721 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-config\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.328056 master-0 kubenswrapper[28758]: I0223 14:47:47.328009 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0"
Feb 23 14:47:47.328528 master-0 kubenswrapper[28758]: I0223 14:47:47.328474 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName:
\"kubernetes.io/configmap/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:47.329782 master-0 kubenswrapper[28758]: I0223 14:47:47.329745 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 14:47:47.329782 master-0 kubenswrapper[28758]: I0223 14:47:47.329774 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-768b860f-83e1-40ba-87aa-048b037fc1a8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^57769bc6-2dcd-45a6-8a58-22ffb7836429\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/cdd3576e6c2cfabb640db72a3f9a50022fff9652a98cdccafd69f51a4ba126aa/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:47.330085 master-0 kubenswrapper[28758]: I0223 14:47:47.330048 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:47.332065 master-0 kubenswrapper[28758]: I0223 14:47:47.332008 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:47.340294 master-0 kubenswrapper[28758]: I0223 14:47:47.340170 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-metrics-certs-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:47.352627 master-0 kubenswrapper[28758]: I0223 14:47:47.352265 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hdvk\" (UniqueName: \"kubernetes.io/projected/d4dd9afd-e1bd-494b-a6d8-d18012ac483b-kube-api-access-2hdvk\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:48.627348 master-0 kubenswrapper[28758]: I0223 14:47:48.627164 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 23 14:47:48.873651 master-0 kubenswrapper[28758]: W0223 14:47:48.873588 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29c2ac3e_73c3_4a07_a129_fcea6817fda3.slice/crio-5581f30998fc0abbef5f292efd14a46e73ca8d4fa1f823b3a775fba9ccbc4453 WatchSource:0}: Error finding container 5581f30998fc0abbef5f292efd14a46e73ca8d4fa1f823b3a775fba9ccbc4453: Status 404 returned error can't find the container with id 5581f30998fc0abbef5f292efd14a46e73ca8d4fa1f823b3a775fba9ccbc4453 Feb 23 14:47:49.000462 master-0 kubenswrapper[28758]: I0223 14:47:49.000408 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-768b860f-83e1-40ba-87aa-048b037fc1a8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^57769bc6-2dcd-45a6-8a58-22ffb7836429\") pod \"ovsdbserver-sb-0\" (UID: \"d4dd9afd-e1bd-494b-a6d8-d18012ac483b\") " pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:49.265352 master-0 kubenswrapper[28758]: I0223 14:47:49.265307 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 23 14:47:49.685814 master-0 kubenswrapper[28758]: I0223 14:47:49.685753 28758 generic.go:334] "Generic (PLEG): container finished" podID="c78f8eb5-a4fd-48e0-9eca-e89d4896fc67" containerID="9a89dfa3610c9820fb0804f8fa0f927164a7399aaa227df9da1c91b6d1de11fb" exitCode=0 Feb 23 14:47:49.686470 master-0 kubenswrapper[28758]: I0223 14:47:49.685873 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" event={"ID":"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67","Type":"ContainerDied","Data":"9a89dfa3610c9820fb0804f8fa0f927164a7399aaa227df9da1c91b6d1de11fb"} Feb 23 14:47:49.690231 master-0 kubenswrapper[28758]: I0223 14:47:49.690158 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"29c2ac3e-73c3-4a07-a129-fcea6817fda3","Type":"ContainerStarted","Data":"5581f30998fc0abbef5f292efd14a46e73ca8d4fa1f823b3a775fba9ccbc4453"} Feb 23 14:47:49.692280 master-0 kubenswrapper[28758]: I0223 14:47:49.692251 28758 generic.go:334] "Generic (PLEG): container finished" podID="f23ec168-6731-4fdf-8b85-837218a794f6" containerID="a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd" exitCode=0 Feb 23 14:47:49.692377 master-0 kubenswrapper[28758]: I0223 14:47:49.692286 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" event={"ID":"f23ec168-6731-4fdf-8b85-837218a794f6","Type":"ContainerDied","Data":"a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd"} Feb 23 14:47:49.909806 master-0 kubenswrapper[28758]: I0223 14:47:49.908762 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 23 14:47:49.914611 master-0 kubenswrapper[28758]: W0223 14:47:49.912230 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode18414da_932f_4a26_ab6a_af32aa83196b.slice/crio-c192f562f8061d7390a26544b7c342d33b65ddcb254f4aff7976562da05b179c WatchSource:0}: Error finding container c192f562f8061d7390a26544b7c342d33b65ddcb254f4aff7976562da05b179c: Status 404 returned error can't find the container with id c192f562f8061d7390a26544b7c342d33b65ddcb254f4aff7976562da05b179c Feb 23 14:47:49.951593 master-0 kubenswrapper[28758]: W0223 14:47:49.951529 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7a29d97_0b1c_4657_ae05_8ef48a3813ba.slice/crio-7d204331ce49cb035da959569aea6f3dceeaca9d0118cab9fba7a2cc50418270 WatchSource:0}: Error finding container 7d204331ce49cb035da959569aea6f3dceeaca9d0118cab9fba7a2cc50418270: Status 404 returned error can't find the container with id 7d204331ce49cb035da959569aea6f3dceeaca9d0118cab9fba7a2cc50418270 Feb 23 14:47:49.959566 master-0 kubenswrapper[28758]: I0223 14:47:49.959519 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 23 14:47:49.968688 master-0 kubenswrapper[28758]: I0223 14:47:49.968620 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 23 14:47:49.976633 master-0 kubenswrapper[28758]: I0223 14:47:49.976573 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lssgn"] Feb 23 14:47:50.341333 master-0 kubenswrapper[28758]: I0223 14:47:50.341016 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 23 14:47:50.374159 master-0 kubenswrapper[28758]: I0223 14:47:50.374112 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:50.558092 master-0 kubenswrapper[28758]: I0223 14:47:50.557030 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-config\") pod \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\" (UID: \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " Feb 23 14:47:50.558661 master-0 kubenswrapper[28758]: I0223 14:47:50.558643 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdd52\" (UniqueName: \"kubernetes.io/projected/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-kube-api-access-bdd52\") pod \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\" (UID: \"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67\") " Feb 23 14:47:50.566168 master-0 kubenswrapper[28758]: I0223 14:47:50.566111 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-kube-api-access-bdd52" (OuterVolumeSpecName: "kube-api-access-bdd52") pod "c78f8eb5-a4fd-48e0-9eca-e89d4896fc67" (UID: "c78f8eb5-a4fd-48e0-9eca-e89d4896fc67"). InnerVolumeSpecName "kube-api-access-bdd52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:47:50.589010 master-0 kubenswrapper[28758]: I0223 14:47:50.588706 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-config" (OuterVolumeSpecName: "config") pod "c78f8eb5-a4fd-48e0-9eca-e89d4896fc67" (UID: "c78f8eb5-a4fd-48e0-9eca-e89d4896fc67"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:47:50.660168 master-0 kubenswrapper[28758]: I0223 14:47:50.660111 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:47:50.660168 master-0 kubenswrapper[28758]: I0223 14:47:50.660161 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdd52\" (UniqueName: \"kubernetes.io/projected/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67-kube-api-access-bdd52\") on node \"master-0\" DevicePath \"\"" Feb 23 14:47:50.702122 master-0 kubenswrapper[28758]: I0223 14:47:50.702053 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e18414da-932f-4a26-ab6a-af32aa83196b","Type":"ContainerStarted","Data":"c192f562f8061d7390a26544b7c342d33b65ddcb254f4aff7976562da05b179c"} Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.704566 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.704582 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc7f9869-fvhpj" event={"ID":"c78f8eb5-a4fd-48e0-9eca-e89d4896fc67","Type":"ContainerDied","Data":"2fe52a6ced5b50c75a6cec9826c0d909bb5d0bf9bfa82c010ab5dbc4d86ddd65"} Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.704658 28758 scope.go:117] "RemoveContainer" containerID="9a89dfa3610c9820fb0804f8fa0f927164a7399aaa227df9da1c91b6d1de11fb" Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.707191 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" event={"ID":"ef57f6b9-940a-45fd-99a9-520e64c5be84","Type":"ContainerStarted","Data":"c6c992abcf7be04397717dec05fcac91e86bb4321d6f977078593b229371d544"} Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.708951 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lssgn" event={"ID":"b9342b63-0ec2-4c10-898a-cebd5e86414a","Type":"ContainerStarted","Data":"abbfad45489e014417a2636e8ab51bec3b78badf51f9dfcd92dffbd84dff1d53"} Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.711205 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" event={"ID":"f23ec168-6731-4fdf-8b85-837218a794f6","Type":"ContainerStarted","Data":"c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef"} Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.711358 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" Feb 23 14:47:50.713492 master-0 kubenswrapper[28758]: I0223 14:47:50.712757 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"373bbd85-b2d4-40a4-afc1-3ecf50a666e7","Type":"ContainerStarted","Data":"2c83553a62d80083d9c06877bf50ba118967fb981746ff62352817ef898f1bb3"} Feb 23 14:47:50.714077 master-0 kubenswrapper[28758]: I0223 14:47:50.714033 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"daacc97c-efdc-40e3-b833-237dde2caafe","Type":"ContainerStarted","Data":"fee6d05cfa3e53e583b0fcd2836fea805a51b51310528337a6a2615fe0bcbdbe"} Feb 23 14:47:50.715677 master-0 kubenswrapper[28758]: I0223 14:47:50.715650 28758 generic.go:334] "Generic (PLEG): container finished" podID="eaf6c126-dc0e-4840-9c9d-afef68564e47" containerID="0c809e0fef02c7a99e7a92bd7daea9c7e16fbab2f76b7f62c1e4ccee182707b0" exitCode=0 Feb 23 14:47:50.715798 master-0 kubenswrapper[28758]: I0223 14:47:50.715706 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4c486879-dcskk" event={"ID":"eaf6c126-dc0e-4840-9c9d-afef68564e47","Type":"ContainerDied","Data":"0c809e0fef02c7a99e7a92bd7daea9c7e16fbab2f76b7f62c1e4ccee182707b0"} Feb 23 14:47:50.717469 master-0 kubenswrapper[28758]: I0223 14:47:50.717434 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7a29d97-0b1c-4657-ae05-8ef48a3813ba","Type":"ContainerStarted","Data":"7d204331ce49cb035da959569aea6f3dceeaca9d0118cab9fba7a2cc50418270"} Feb 23 14:47:51.349040 master-0 kubenswrapper[28758]: I0223 14:47:51.348984 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:51.492261 master-0 kubenswrapper[28758]: I0223 14:47:51.492109 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj2x4\" (UniqueName: \"kubernetes.io/projected/eaf6c126-dc0e-4840-9c9d-afef68564e47-kube-api-access-wj2x4\") pod \"eaf6c126-dc0e-4840-9c9d-afef68564e47\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " Feb 23 14:47:51.492261 master-0 kubenswrapper[28758]: I0223 14:47:51.492215 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-dns-svc\") pod \"eaf6c126-dc0e-4840-9c9d-afef68564e47\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " Feb 23 14:47:51.492261 master-0 kubenswrapper[28758]: I0223 14:47:51.492255 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-config\") pod \"eaf6c126-dc0e-4840-9c9d-afef68564e47\" (UID: \"eaf6c126-dc0e-4840-9c9d-afef68564e47\") " Feb 23 14:47:51.497237 master-0 kubenswrapper[28758]: I0223 14:47:51.495689 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf6c126-dc0e-4840-9c9d-afef68564e47-kube-api-access-wj2x4" (OuterVolumeSpecName: "kube-api-access-wj2x4") pod "eaf6c126-dc0e-4840-9c9d-afef68564e47" (UID: "eaf6c126-dc0e-4840-9c9d-afef68564e47"). InnerVolumeSpecName "kube-api-access-wj2x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:47:51.517326 master-0 kubenswrapper[28758]: I0223 14:47:51.517234 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eaf6c126-dc0e-4840-9c9d-afef68564e47" (UID: "eaf6c126-dc0e-4840-9c9d-afef68564e47"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:47:51.530747 master-0 kubenswrapper[28758]: I0223 14:47:51.530687 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-config" (OuterVolumeSpecName: "config") pod "eaf6c126-dc0e-4840-9c9d-afef68564e47" (UID: "eaf6c126-dc0e-4840-9c9d-afef68564e47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:47:51.551050 master-0 kubenswrapper[28758]: I0223 14:47:51.550933 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" podStartSLOduration=4.012149691 podStartE2EDuration="20.550906733s" podCreationTimestamp="2026-02-23 14:47:31 +0000 UTC" firstStartedPulling="2026-02-23 14:47:32.672031616 +0000 UTC m=+784.798347538" lastFinishedPulling="2026-02-23 14:47:49.210788648 +0000 UTC m=+801.337104580" observedRunningTime="2026-02-23 14:47:51.545796257 +0000 UTC m=+803.672112189" watchObservedRunningTime="2026-02-23 14:47:51.550906733 +0000 UTC m=+803.677222665" Feb 23 14:47:51.595188 master-0 kubenswrapper[28758]: I0223 14:47:51.594707 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj2x4\" (UniqueName: \"kubernetes.io/projected/eaf6c126-dc0e-4840-9c9d-afef68564e47-kube-api-access-wj2x4\") on node \"master-0\" DevicePath \"\"" Feb 23 14:47:51.595188 master-0 kubenswrapper[28758]: I0223 14:47:51.594774 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:47:51.595188 master-0 kubenswrapper[28758]: I0223 14:47:51.594789 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf6c126-dc0e-4840-9c9d-afef68564e47-config\") on node \"master-0\" DevicePath 
\"\"" Feb 23 14:47:51.736669 master-0 kubenswrapper[28758]: I0223 14:47:51.736592 28758 generic.go:334] "Generic (PLEG): container finished" podID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerID="c6c992abcf7be04397717dec05fcac91e86bb4321d6f977078593b229371d544" exitCode=0 Feb 23 14:47:51.737232 master-0 kubenswrapper[28758]: I0223 14:47:51.736689 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" event={"ID":"ef57f6b9-940a-45fd-99a9-520e64c5be84","Type":"ContainerDied","Data":"c6c992abcf7be04397717dec05fcac91e86bb4321d6f977078593b229371d544"} Feb 23 14:47:51.741556 master-0 kubenswrapper[28758]: I0223 14:47:51.741509 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4c486879-dcskk" event={"ID":"eaf6c126-dc0e-4840-9c9d-afef68564e47","Type":"ContainerDied","Data":"d59195820f21aadfc5248596519caec5ee5406053afd57b72ad59862d7fb04d9"} Feb 23 14:47:51.741649 master-0 kubenswrapper[28758]: I0223 14:47:51.741574 28758 scope.go:117] "RemoveContainer" containerID="0c809e0fef02c7a99e7a92bd7daea9c7e16fbab2f76b7f62c1e4ccee182707b0" Feb 23 14:47:51.741649 master-0 kubenswrapper[28758]: I0223 14:47:51.741523 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4c486879-dcskk" Feb 23 14:47:51.956041 master-0 kubenswrapper[28758]: I0223 14:47:51.955941 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc7f9869-fvhpj"] Feb 23 14:47:51.982321 master-0 kubenswrapper[28758]: I0223 14:47:51.982268 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc7f9869-fvhpj"] Feb 23 14:47:52.010996 master-0 kubenswrapper[28758]: I0223 14:47:52.010865 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 23 14:47:52.027102 master-0 kubenswrapper[28758]: I0223 14:47:52.027008 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4c486879-dcskk"] Feb 23 14:47:52.035311 master-0 kubenswrapper[28758]: I0223 14:47:52.035229 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d4c486879-dcskk"] Feb 23 14:47:52.100803 master-0 kubenswrapper[28758]: I0223 14:47:52.100741 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c78f8eb5-a4fd-48e0-9eca-e89d4896fc67" path="/var/lib/kubelet/pods/c78f8eb5-a4fd-48e0-9eca-e89d4896fc67/volumes" Feb 23 14:47:52.101357 master-0 kubenswrapper[28758]: I0223 14:47:52.101326 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf6c126-dc0e-4840-9c9d-afef68564e47" path="/var/lib/kubelet/pods/eaf6c126-dc0e-4840-9c9d-afef68564e47/volumes" Feb 23 14:47:52.153943 master-0 kubenswrapper[28758]: I0223 14:47:52.153879 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-td6ds"] Feb 23 14:47:52.455821 master-0 kubenswrapper[28758]: W0223 14:47:52.455765 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4dd9afd_e1bd_494b_a6d8_d18012ac483b.slice/crio-d2c28b439d30ba1c4cadee7e2b2a534e9351220dff90775a5475dbea895e7fb7 WatchSource:0}: Error 
finding container d2c28b439d30ba1c4cadee7e2b2a534e9351220dff90775a5475dbea895e7fb7: Status 404 returned error can't find the container with id d2c28b439d30ba1c4cadee7e2b2a534e9351220dff90775a5475dbea895e7fb7 Feb 23 14:47:52.526806 master-0 kubenswrapper[28758]: I0223 14:47:52.526707 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 23 14:47:52.759468 master-0 kubenswrapper[28758]: I0223 14:47:52.759341 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d4dd9afd-e1bd-494b-a6d8-d18012ac483b","Type":"ContainerStarted","Data":"d2c28b439d30ba1c4cadee7e2b2a534e9351220dff90775a5475dbea895e7fb7"} Feb 23 14:47:52.764426 master-0 kubenswrapper[28758]: I0223 14:47:52.764357 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-td6ds" event={"ID":"f847003c-7775-4189-896d-b6c727a97222","Type":"ContainerStarted","Data":"ba54a74789b5703c1e85f39c87ac6b1e13bc7f71cde47582ff6d23f411308e7f"} Feb 23 14:47:53.776806 master-0 kubenswrapper[28758]: I0223 14:47:53.776727 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"714c1426-c191-4c69-8346-acf55252ed0a","Type":"ContainerStarted","Data":"28004d9c3a5931b705a947685d8ed7c335f94dbdb2ee6fe77d844f838ffe0686"} Feb 23 14:47:57.117324 master-0 kubenswrapper[28758]: I0223 14:47:57.117267 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" Feb 23 14:47:57.200904 master-0 kubenswrapper[28758]: I0223 14:47:57.200834 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6974cff98c-ph5jg"] Feb 23 14:47:58.840372 master-0 kubenswrapper[28758]: I0223 14:47:58.839636 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-td6ds" 
event={"ID":"f847003c-7775-4189-896d-b6c727a97222","Type":"ContainerStarted","Data":"172449d9265e38cbb2021db3b30e18166c19af9fc882a6cdaf266ccc7c3f7ea8"} Feb 23 14:47:58.841801 master-0 kubenswrapper[28758]: I0223 14:47:58.841411 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"daacc97c-efdc-40e3-b833-237dde2caafe","Type":"ContainerStarted","Data":"8b4a338151c2cd5f2929de4d68e81e3577211c284917c9fdb6e1faab4caac85d"} Feb 23 14:47:58.843629 master-0 kubenswrapper[28758]: I0223 14:47:58.843561 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d4dd9afd-e1bd-494b-a6d8-d18012ac483b","Type":"ContainerStarted","Data":"0c384d50f4c2b2ef61b9b200833772ee43564b20df30b9432b5fd52fc4f6f3d6"} Feb 23 14:47:58.848047 master-0 kubenswrapper[28758]: I0223 14:47:58.847984 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" event={"ID":"ef57f6b9-940a-45fd-99a9-520e64c5be84","Type":"ContainerStarted","Data":"bcd912e4911bbc2a1ba92b1773787bc6212269ff0176a82441e10bd719cc51c5"} Feb 23 14:47:58.848146 master-0 kubenswrapper[28758]: I0223 14:47:58.848043 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" podUID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerName="dnsmasq-dns" containerID="cri-o://bcd912e4911bbc2a1ba92b1773787bc6212269ff0176a82441e10bd719cc51c5" gracePeriod=10 Feb 23 14:47:58.848146 master-0 kubenswrapper[28758]: I0223 14:47:58.848068 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" Feb 23 14:47:58.849507 master-0 kubenswrapper[28758]: I0223 14:47:58.849465 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"29c2ac3e-73c3-4a07-a129-fcea6817fda3","Type":"ContainerStarted","Data":"21dd199d6d45259f7a8ebf928cbd1e9314f9b197171fc3f62ec08a786a3f3bbd"} Feb 23 
14:47:58.850299 master-0 kubenswrapper[28758]: I0223 14:47:58.850258 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 23 14:47:58.851955 master-0 kubenswrapper[28758]: I0223 14:47:58.851911 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lssgn" event={"ID":"b9342b63-0ec2-4c10-898a-cebd5e86414a","Type":"ContainerStarted","Data":"bb6c1ab4f7f8a80da473d6a2706dd29951d48688869f8c541a6c1c7b90ee31bb"}
Feb 23 14:47:58.852132 master-0 kubenswrapper[28758]: I0223 14:47:58.852097 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lssgn"
Feb 23 14:47:58.857610 master-0 kubenswrapper[28758]: I0223 14:47:58.857424 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"714c1426-c191-4c69-8346-acf55252ed0a","Type":"ContainerStarted","Data":"71bc72bd8843bed2d399abfaf953e8a74b6e010e2464665a5979463ecabc7ece"}
Feb 23 14:47:58.915516 master-0 kubenswrapper[28758]: I0223 14:47:58.914889 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.864756154 podStartE2EDuration="23.914868618s" podCreationTimestamp="2026-02-23 14:47:35 +0000 UTC" firstStartedPulling="2026-02-23 14:47:48.878288474 +0000 UTC m=+801.004604416" lastFinishedPulling="2026-02-23 14:47:56.928400958 +0000 UTC m=+809.054716880" observedRunningTime="2026-02-23 14:47:58.895644727 +0000 UTC m=+811.021960669" watchObservedRunningTime="2026-02-23 14:47:58.914868618 +0000 UTC m=+811.041184550"
Feb 23 14:47:58.936578 master-0 kubenswrapper[28758]: I0223 14:47:58.936494 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lssgn" podStartSLOduration=10.110555293 podStartE2EDuration="17.936458411s" podCreationTimestamp="2026-02-23 14:47:41 +0000 UTC" firstStartedPulling="2026-02-23 14:47:49.957641391 +0000 UTC m=+802.083957323" lastFinishedPulling="2026-02-23 14:47:57.783544509 +0000 UTC m=+809.909860441" observedRunningTime="2026-02-23 14:47:58.91908875 +0000 UTC m=+811.045404682" watchObservedRunningTime="2026-02-23 14:47:58.936458411 +0000 UTC m=+811.062774343"
Feb 23 14:47:58.966372 master-0 kubenswrapper[28758]: I0223 14:47:58.966279 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" podStartSLOduration=9.916412384000001 podStartE2EDuration="27.966260283s" podCreationTimestamp="2026-02-23 14:47:31 +0000 UTC" firstStartedPulling="2026-02-23 14:47:32.29991546 +0000 UTC m=+784.426231392" lastFinishedPulling="2026-02-23 14:47:50.349763279 +0000 UTC m=+802.476079291" observedRunningTime="2026-02-23 14:47:58.961298771 +0000 UTC m=+811.087614703" watchObservedRunningTime="2026-02-23 14:47:58.966260283 +0000 UTC m=+811.092576225"
Feb 23 14:48:00.269127 master-0 kubenswrapper[28758]: I0223 14:48:00.268785 28758 generic.go:334] "Generic (PLEG): container finished" podID="f847003c-7775-4189-896d-b6c727a97222" containerID="172449d9265e38cbb2021db3b30e18166c19af9fc882a6cdaf266ccc7c3f7ea8" exitCode=0
Feb 23 14:48:00.269127 master-0 kubenswrapper[28758]: I0223 14:48:00.268893 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-td6ds" event={"ID":"f847003c-7775-4189-896d-b6c727a97222","Type":"ContainerDied","Data":"172449d9265e38cbb2021db3b30e18166c19af9fc882a6cdaf266ccc7c3f7ea8"}
Feb 23 14:48:00.281035 master-0 kubenswrapper[28758]: I0223 14:48:00.280965 28758 generic.go:334] "Generic (PLEG): container finished" podID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerID="bcd912e4911bbc2a1ba92b1773787bc6212269ff0176a82441e10bd719cc51c5" exitCode=0
Feb 23 14:48:00.281128 master-0 kubenswrapper[28758]: I0223 14:48:00.281050 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" event={"ID":"ef57f6b9-940a-45fd-99a9-520e64c5be84","Type":"ContainerDied","Data":"bcd912e4911bbc2a1ba92b1773787bc6212269ff0176a82441e10bd719cc51c5"}
Feb 23 14:48:00.299565 master-0 kubenswrapper[28758]: I0223 14:48:00.298371 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7a29d97-0b1c-4657-ae05-8ef48a3813ba","Type":"ContainerStarted","Data":"8d766af47845207179f5c820b9d4ceec8ccb38b2eccf9b5aa9318df84e864043"}
Feb 23 14:48:00.645680 master-0 kubenswrapper[28758]: I0223 14:48:00.644080 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg"
Feb 23 14:48:00.811591 master-0 kubenswrapper[28758]: I0223 14:48:00.811510 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-config\") pod \"ef57f6b9-940a-45fd-99a9-520e64c5be84\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") "
Feb 23 14:48:00.811980 master-0 kubenswrapper[28758]: I0223 14:48:00.811758 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-dns-svc\") pod \"ef57f6b9-940a-45fd-99a9-520e64c5be84\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") "
Feb 23 14:48:00.811980 master-0 kubenswrapper[28758]: I0223 14:48:00.811794 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhvwx\" (UniqueName: \"kubernetes.io/projected/ef57f6b9-940a-45fd-99a9-520e64c5be84-kube-api-access-fhvwx\") pod \"ef57f6b9-940a-45fd-99a9-520e64c5be84\" (UID: \"ef57f6b9-940a-45fd-99a9-520e64c5be84\") "
Feb 23 14:48:00.816759 master-0 kubenswrapper[28758]: I0223 14:48:00.816583 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef57f6b9-940a-45fd-99a9-520e64c5be84-kube-api-access-fhvwx" (OuterVolumeSpecName: "kube-api-access-fhvwx") pod "ef57f6b9-940a-45fd-99a9-520e64c5be84" (UID: "ef57f6b9-940a-45fd-99a9-520e64c5be84"). InnerVolumeSpecName "kube-api-access-fhvwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:00.861578 master-0 kubenswrapper[28758]: I0223 14:48:00.861528 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef57f6b9-940a-45fd-99a9-520e64c5be84" (UID: "ef57f6b9-940a-45fd-99a9-520e64c5be84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:00.882565 master-0 kubenswrapper[28758]: I0223 14:48:00.881529 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-config" (OuterVolumeSpecName: "config") pod "ef57f6b9-940a-45fd-99a9-520e64c5be84" (UID: "ef57f6b9-940a-45fd-99a9-520e64c5be84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:00.914134 master-0 kubenswrapper[28758]: I0223 14:48:00.914025 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-dns-svc\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:00.914134 master-0 kubenswrapper[28758]: I0223 14:48:00.914133 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhvwx\" (UniqueName: \"kubernetes.io/projected/ef57f6b9-940a-45fd-99a9-520e64c5be84-kube-api-access-fhvwx\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:00.914134 master-0 kubenswrapper[28758]: I0223 14:48:00.914145 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef57f6b9-940a-45fd-99a9-520e64c5be84-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:01.313181 master-0 kubenswrapper[28758]: I0223 14:48:01.313055 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e18414da-932f-4a26-ab6a-af32aa83196b","Type":"ContainerStarted","Data":"5f9cd75c8ea3610b17762a7a2a82fbd63abba29b71c7f31c59a93ce46315286e"}
Feb 23 14:48:01.316138 master-0 kubenswrapper[28758]: I0223 14:48:01.316096 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"373bbd85-b2d4-40a4-afc1-3ecf50a666e7","Type":"ContainerStarted","Data":"28e4b899ada51892b7fdfdf96a548fe19b7e2f8172bb6e0f29396a68e2ad1446"}
Feb 23 14:48:01.318947 master-0 kubenswrapper[28758]: I0223 14:48:01.318873 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg" event={"ID":"ef57f6b9-940a-45fd-99a9-520e64c5be84","Type":"ContainerDied","Data":"e0bf3f6cfa4e400857642830a865234e9c759589d651ea562f5cf6cc5f68fec5"}
Feb 23 14:48:01.318947 master-0 kubenswrapper[28758]: I0223 14:48:01.318937 28758 scope.go:117] "RemoveContainer" containerID="bcd912e4911bbc2a1ba92b1773787bc6212269ff0176a82441e10bd719cc51c5"
Feb 23 14:48:01.319120 master-0 kubenswrapper[28758]: I0223 14:48:01.319031 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6974cff98c-ph5jg"
Feb 23 14:48:01.330703 master-0 kubenswrapper[28758]: I0223 14:48:01.330634 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-td6ds" event={"ID":"f847003c-7775-4189-896d-b6c727a97222","Type":"ContainerStarted","Data":"7a740c4f6ed5ec60d677ab950c70b0dda7001c5f2c9d0c16a4cb89bf2152cd11"}
Feb 23 14:48:01.415551 master-0 kubenswrapper[28758]: I0223 14:48:01.415452 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6974cff98c-ph5jg"]
Feb 23 14:48:01.425190 master-0 kubenswrapper[28758]: I0223 14:48:01.425126 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6974cff98c-ph5jg"]
Feb 23 14:48:01.541759 master-0 kubenswrapper[28758]: I0223 14:48:01.541709 28758 scope.go:117] "RemoveContainer" containerID="c6c992abcf7be04397717dec05fcac91e86bb4321d6f977078593b229371d544"
Feb 23 14:48:02.099562 master-0 kubenswrapper[28758]: I0223 14:48:02.099498 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef57f6b9-940a-45fd-99a9-520e64c5be84" path="/var/lib/kubelet/pods/ef57f6b9-940a-45fd-99a9-520e64c5be84/volumes"
Feb 23 14:48:02.341948 master-0 kubenswrapper[28758]: I0223 14:48:02.341859 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"714c1426-c191-4c69-8346-acf55252ed0a","Type":"ContainerStarted","Data":"d55a3ffe621006ab3ad8636c0832146576203de7a2073dcb3ce20bf9a4c40bb9"}
Feb 23 14:48:02.346954 master-0 kubenswrapper[28758]: I0223 14:48:02.346906 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-td6ds" event={"ID":"f847003c-7775-4189-896d-b6c727a97222","Type":"ContainerStarted","Data":"3051dc624e01e9c3e4a8bd2b8c50e19fde42e142e05797e90c68bf33c26070ac"}
Feb 23 14:48:02.347130 master-0 kubenswrapper[28758]: I0223 14:48:02.347069 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:48:02.349579 master-0 kubenswrapper[28758]: I0223 14:48:02.349516 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d4dd9afd-e1bd-494b-a6d8-d18012ac483b","Type":"ContainerStarted","Data":"9990756a7bafb47d976ea9b898f311c8e5f2f326f1daa1a1485ba8ddc56b357e"}
Feb 23 14:48:02.368256 master-0 kubenswrapper[28758]: I0223 14:48:02.368174 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.441816313 podStartE2EDuration="22.368154528s" podCreationTimestamp="2026-02-23 14:47:40 +0000 UTC" firstStartedPulling="2026-02-23 14:47:53.718082073 +0000 UTC m=+805.844398005" lastFinishedPulling="2026-02-23 14:48:01.644420288 +0000 UTC m=+813.770736220" observedRunningTime="2026-02-23 14:48:02.361766099 +0000 UTC m=+814.488082051" watchObservedRunningTime="2026-02-23 14:48:02.368154528 +0000 UTC m=+814.494470460"
Feb 23 14:48:02.404991 master-0 kubenswrapper[28758]: I0223 14:48:02.404046 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-td6ds" podStartSLOduration=16.035111455 podStartE2EDuration="21.403994481s" podCreationTimestamp="2026-02-23 14:47:41 +0000 UTC" firstStartedPulling="2026-02-23 14:47:52.467018024 +0000 UTC m=+804.593333956" lastFinishedPulling="2026-02-23 14:47:57.83590105 +0000 UTC m=+809.962216982" observedRunningTime="2026-02-23 14:48:02.382421917 +0000 UTC m=+814.508737849" watchObservedRunningTime="2026-02-23 14:48:02.403994481 +0000 UTC m=+814.530310413"
Feb 23 14:48:02.417690 master-0 kubenswrapper[28758]: I0223 14:48:02.417204 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.221643195 podStartE2EDuration="18.417186311s" podCreationTimestamp="2026-02-23 14:47:44 +0000 UTC" firstStartedPulling="2026-02-23 14:47:52.459031192 +0000 UTC m=+804.585347124" lastFinishedPulling="2026-02-23 14:48:01.654574308 +0000 UTC m=+813.780890240" observedRunningTime="2026-02-23 14:48:02.416706858 +0000 UTC m=+814.543022790" watchObservedRunningTime="2026-02-23 14:48:02.417186311 +0000 UTC m=+814.543502243"
Feb 23 14:48:03.362625 master-0 kubenswrapper[28758]: I0223 14:48:03.362527 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:48:04.227835 master-0 kubenswrapper[28758]: I0223 14:48:04.227744 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 23 14:48:04.266011 master-0 kubenswrapper[28758]: I0223 14:48:04.265934 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 23 14:48:04.266011 master-0 kubenswrapper[28758]: I0223 14:48:04.265998 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 23 14:48:04.305504 master-0 kubenswrapper[28758]: I0223 14:48:04.305402 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 23 14:48:04.372929 master-0 kubenswrapper[28758]: I0223 14:48:04.372870 28758 generic.go:334] "Generic (PLEG): container finished" podID="daacc97c-efdc-40e3-b833-237dde2caafe" containerID="8b4a338151c2cd5f2929de4d68e81e3577211c284917c9fdb6e1faab4caac85d" exitCode=0
Feb 23 14:48:04.372929 master-0 kubenswrapper[28758]: I0223 14:48:04.372897 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"daacc97c-efdc-40e3-b833-237dde2caafe","Type":"ContainerDied","Data":"8b4a338151c2cd5f2929de4d68e81e3577211c284917c9fdb6e1faab4caac85d"}
Feb 23 14:48:04.414528 master-0 kubenswrapper[28758]: I0223 14:48:04.413779 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 23 14:48:04.714741 master-0 kubenswrapper[28758]: I0223 14:48:04.714663 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-679f75d775-xhpfv"]
Feb 23 14:48:04.715255 master-0 kubenswrapper[28758]: E0223 14:48:04.715224 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78f8eb5-a4fd-48e0-9eca-e89d4896fc67" containerName="init"
Feb 23 14:48:04.715255 master-0 kubenswrapper[28758]: I0223 14:48:04.715248 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78f8eb5-a4fd-48e0-9eca-e89d4896fc67" containerName="init"
Feb 23 14:48:04.715365 master-0 kubenswrapper[28758]: E0223 14:48:04.715272 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerName="dnsmasq-dns"
Feb 23 14:48:04.715365 master-0 kubenswrapper[28758]: I0223 14:48:04.715283 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerName="dnsmasq-dns"
Feb 23 14:48:04.715365 master-0 kubenswrapper[28758]: E0223 14:48:04.715317 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerName="init"
Feb 23 14:48:04.715365 master-0 kubenswrapper[28758]: I0223 14:48:04.715327 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerName="init"
Feb 23 14:48:04.715365 master-0 kubenswrapper[28758]: E0223 14:48:04.715356 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf6c126-dc0e-4840-9c9d-afef68564e47" containerName="init"
Feb 23 14:48:04.715365 master-0 kubenswrapper[28758]: I0223 14:48:04.715364 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf6c126-dc0e-4840-9c9d-afef68564e47" containerName="init"
Feb 23 14:48:04.715709 master-0 kubenswrapper[28758]: I0223 14:48:04.715668 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef57f6b9-940a-45fd-99a9-520e64c5be84" containerName="dnsmasq-dns"
Feb 23 14:48:04.715758 master-0 kubenswrapper[28758]: I0223 14:48:04.715721 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78f8eb5-a4fd-48e0-9eca-e89d4896fc67" containerName="init"
Feb 23 14:48:04.715758 master-0 kubenswrapper[28758]: I0223 14:48:04.715752 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf6c126-dc0e-4840-9c9d-afef68564e47" containerName="init"
Feb 23 14:48:04.716874 master-0 kubenswrapper[28758]: I0223 14:48:04.716852 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.719692 master-0 kubenswrapper[28758]: I0223 14:48:04.719674 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 23 14:48:04.731046 master-0 kubenswrapper[28758]: I0223 14:48:04.730754 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-679f75d775-xhpfv"]
Feb 23 14:48:04.794709 master-0 kubenswrapper[28758]: I0223 14:48:04.794637 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pz8wl"]
Feb 23 14:48:04.796291 master-0 kubenswrapper[28758]: I0223 14:48:04.796250 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.798761 master-0 kubenswrapper[28758]: I0223 14:48:04.798731 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 23 14:48:04.826081 master-0 kubenswrapper[28758]: I0223 14:48:04.826019 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pz8wl"]
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.874766 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts2qz\" (UniqueName: \"kubernetes.io/projected/6aac2224-3117-41c5-b022-534475fa21f9-kube-api-access-ts2qz\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.874880 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ab015df-a84b-45bb-8955-b2210c1564f0-ovs-rundir\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.874922 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab015df-a84b-45bb-8955-b2210c1564f0-combined-ca-bundle\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.874999 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab015df-a84b-45bb-8955-b2210c1564f0-config\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.875042 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-config\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.875104 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-ovsdbserver-sb\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.875145 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab015df-a84b-45bb-8955-b2210c1564f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.875171 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhh5c\" (UniqueName: \"kubernetes.io/projected/2ab015df-a84b-45bb-8955-b2210c1564f0-kube-api-access-qhh5c\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.875207 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-dns-svc\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.878640 master-0 kubenswrapper[28758]: I0223 14:48:04.875229 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ab015df-a84b-45bb-8955-b2210c1564f0-ovn-rundir\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.977368 master-0 kubenswrapper[28758]: I0223 14:48:04.977226 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-ovsdbserver-sb\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.977368 master-0 kubenswrapper[28758]: I0223 14:48:04.977326 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab015df-a84b-45bb-8955-b2210c1564f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.977852 master-0 kubenswrapper[28758]: I0223 14:48:04.977459 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhh5c\" (UniqueName: \"kubernetes.io/projected/2ab015df-a84b-45bb-8955-b2210c1564f0-kube-api-access-qhh5c\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.977852 master-0 kubenswrapper[28758]: I0223 14:48:04.977821 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-dns-svc\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.977930 master-0 kubenswrapper[28758]: I0223 14:48:04.977877 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ab015df-a84b-45bb-8955-b2210c1564f0-ovn-rundir\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.977983 master-0 kubenswrapper[28758]: I0223 14:48:04.977943 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts2qz\" (UniqueName: \"kubernetes.io/projected/6aac2224-3117-41c5-b022-534475fa21f9-kube-api-access-ts2qz\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.978099 master-0 kubenswrapper[28758]: I0223 14:48:04.978047 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ab015df-a84b-45bb-8955-b2210c1564f0-ovs-rundir\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.978181 master-0 kubenswrapper[28758]: I0223 14:48:04.978127 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab015df-a84b-45bb-8955-b2210c1564f0-combined-ca-bundle\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.978181 master-0 kubenswrapper[28758]: I0223 14:48:04.978159 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab015df-a84b-45bb-8955-b2210c1564f0-config\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.978181 master-0 kubenswrapper[28758]: I0223 14:48:04.978175 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2ab015df-a84b-45bb-8955-b2210c1564f0-ovn-rundir\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.978381 master-0 kubenswrapper[28758]: I0223 14:48:04.978288 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2ab015df-a84b-45bb-8955-b2210c1564f0-ovs-rundir\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.978465 master-0 kubenswrapper[28758]: I0223 14:48:04.978409 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-config\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.978569 master-0 kubenswrapper[28758]: I0223 14:48:04.978544 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-dns-svc\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.978625 master-0 kubenswrapper[28758]: I0223 14:48:04.978583 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-ovsdbserver-sb\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.978989 master-0 kubenswrapper[28758]: I0223 14:48:04.978965 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab015df-a84b-45bb-8955-b2210c1564f0-config\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.979503 master-0 kubenswrapper[28758]: I0223 14:48:04.979425 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-config\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:04.981137 master-0 kubenswrapper[28758]: I0223 14:48:04.980751 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ab015df-a84b-45bb-8955-b2210c1564f0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:04.990554 master-0 kubenswrapper[28758]: I0223 14:48:04.988889 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab015df-a84b-45bb-8955-b2210c1564f0-combined-ca-bundle\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:05.008930 master-0 kubenswrapper[28758]: I0223 14:48:05.007564 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts2qz\" (UniqueName: \"kubernetes.io/projected/6aac2224-3117-41c5-b022-534475fa21f9-kube-api-access-ts2qz\") pod \"dnsmasq-dns-679f75d775-xhpfv\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:05.022813 master-0 kubenswrapper[28758]: I0223 14:48:05.022703 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhh5c\" (UniqueName: \"kubernetes.io/projected/2ab015df-a84b-45bb-8955-b2210c1564f0-kube-api-access-qhh5c\") pod \"ovn-controller-metrics-pz8wl\" (UID: \"2ab015df-a84b-45bb-8955-b2210c1564f0\") " pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:05.069166 master-0 kubenswrapper[28758]: I0223 14:48:05.069106 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679f75d775-xhpfv"
Feb 23 14:48:05.107918 master-0 kubenswrapper[28758]: I0223 14:48:05.107482 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679f75d775-xhpfv"]
Feb 23 14:48:05.131007 master-0 kubenswrapper[28758]: I0223 14:48:05.130324 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pz8wl"
Feb 23 14:48:05.166936 master-0 kubenswrapper[28758]: I0223 14:48:05.166874 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79745f7855-48j7g"]
Feb 23 14:48:05.203613 master-0 kubenswrapper[28758]: I0223 14:48:05.201501 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.204152 master-0 kubenswrapper[28758]: I0223 14:48:05.204111 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79745f7855-48j7g"]
Feb 23 14:48:05.207123 master-0 kubenswrapper[28758]: I0223 14:48:05.207098 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 23 14:48:05.228957 master-0 kubenswrapper[28758]: I0223 14:48:05.228808 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 23 14:48:05.287420 master-0 kubenswrapper[28758]: I0223 14:48:05.287373 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-sb\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.287592 master-0 kubenswrapper[28758]: I0223 14:48:05.287448 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5g5j\" (UniqueName: \"kubernetes.io/projected/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-kube-api-access-t5g5j\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.287592 master-0 kubenswrapper[28758]: I0223 14:48:05.287475 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-nb\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.287592 master-0 kubenswrapper[28758]: I0223 14:48:05.287501 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-config\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.288142 master-0 kubenswrapper[28758]: I0223 14:48:05.288088 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-dns-svc\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.300031 master-0 kubenswrapper[28758]: I0223 14:48:05.298150 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 23 14:48:05.383983 master-0 kubenswrapper[28758]: I0223 14:48:05.383907 28758 generic.go:334] "Generic (PLEG): container finished" podID="f7a29d97-0b1c-4657-ae05-8ef48a3813ba" containerID="8d766af47845207179f5c820b9d4ceec8ccb38b2eccf9b5aa9318df84e864043" exitCode=0
Feb 23 14:48:05.384685 master-0 kubenswrapper[28758]: I0223 14:48:05.383995 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7a29d97-0b1c-4657-ae05-8ef48a3813ba","Type":"ContainerDied","Data":"8d766af47845207179f5c820b9d4ceec8ccb38b2eccf9b5aa9318df84e864043"}
Feb 23 14:48:05.389769 master-0 kubenswrapper[28758]: I0223 14:48:05.389706 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-dns-svc\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.389844 master-0 kubenswrapper[28758]: I0223 14:48:05.389774 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-sb\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.389844 master-0 kubenswrapper[28758]: I0223 14:48:05.389809 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5g5j\" (UniqueName: \"kubernetes.io/projected/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-kube-api-access-t5g5j\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.389844 master-0 kubenswrapper[28758]: I0223 14:48:05.389831 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-nb\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.389981 master-0 kubenswrapper[28758]: I0223 14:48:05.389851 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-config\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.391135 master-0 kubenswrapper[28758]: I0223 14:48:05.391051 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-config\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.391447 master-0 kubenswrapper[28758]: I0223 14:48:05.391397 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-dns-svc\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.391447 master-0 kubenswrapper[28758]: I0223 14:48:05.391421 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-sb\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.392201 master-0 kubenswrapper[28758]: I0223 14:48:05.392122 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"daacc97c-efdc-40e3-b833-237dde2caafe","Type":"ContainerStarted","Data":"16b6adf3f64eee32c1f08f2fce5e15105e3217a7d3a2be02ab405ec097a5a669"}
Feb 23 14:48:05.394864 master-0 kubenswrapper[28758]: I0223 14:48:05.394811 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-nb\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.413870 master-0 kubenswrapper[28758]: I0223 14:48:05.413811 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5g5j\" (UniqueName: \"kubernetes.io/projected/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-kube-api-access-t5g5j\") pod \"dnsmasq-dns-79745f7855-48j7g\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") " pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.445064 master-0 kubenswrapper[28758]: I0223 14:48:05.443768 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.019176559 podStartE2EDuration="33.443745464s" podCreationTimestamp="2026-02-23 14:47:32 +0000 UTC" firstStartedPulling="2026-02-23 14:47:50.352563124 +0000 UTC m=+802.478879056" lastFinishedPulling="2026-02-23 14:47:57.777132029 +0000 UTC m=+809.903447961" observedRunningTime="2026-02-23 14:48:05.435025772 +0000 UTC m=+817.561341704" watchObservedRunningTime="2026-02-23 14:48:05.443745464 +0000 UTC m=+817.570061396"
Feb 23 14:48:05.447994 master-0 kubenswrapper[28758]: I0223 14:48:05.447964 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 23 14:48:05.564094 master-0 kubenswrapper[28758]: I0223 14:48:05.563259 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:05.648062 master-0 kubenswrapper[28758]: I0223 14:48:05.639948 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679f75d775-xhpfv"]
Feb 23 14:48:05.663682 master-0 kubenswrapper[28758]: I0223 14:48:05.663631 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 23 14:48:05.666785 master-0 kubenswrapper[28758]: I0223 14:48:05.665240 28758 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 14:48:05.675721 master-0 kubenswrapper[28758]: I0223 14:48:05.673707 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 23 14:48:05.675721 master-0 kubenswrapper[28758]: I0223 14:48:05.673917 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 23 14:48:05.675721 master-0 kubenswrapper[28758]: I0223 14:48:05.674066 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 23 14:48:05.688630 master-0 kubenswrapper[28758]: W0223 14:48:05.687617 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6aac2224_3117_41c5_b022_534475fa21f9.slice/crio-dc1f50f00a6e697f77692fe52dc995e98b7ec3b0229b3bb62968fc737c4ec987 WatchSource:0}: Error finding container dc1f50f00a6e697f77692fe52dc995e98b7ec3b0229b3bb62968fc737c4ec987: Status 404 returned error can't find the container with id dc1f50f00a6e697f77692fe52dc995e98b7ec3b0229b3bb62968fc737c4ec987 Feb 23 14:48:05.702400 master-0 kubenswrapper[28758]: I0223 14:48:05.702276 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 14:48:05.798715 master-0 kubenswrapper[28758]: I0223 14:48:05.797561 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pz8wl"] Feb 23 14:48:05.804064 master-0 kubenswrapper[28758]: I0223 14:48:05.804010 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.804171 master-0 kubenswrapper[28758]: I0223 14:48:05.804123 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83080fd9-e913-441c-b248-8fd33251ced3-scripts\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.804171 master-0 kubenswrapper[28758]: I0223 14:48:05.804156 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83080fd9-e913-441c-b248-8fd33251ced3-config\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.804321 master-0 kubenswrapper[28758]: I0223 14:48:05.804186 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrr8d\" (UniqueName: \"kubernetes.io/projected/83080fd9-e913-441c-b248-8fd33251ced3-kube-api-access-jrr8d\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.804321 master-0 kubenswrapper[28758]: I0223 14:48:05.804204 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.804321 master-0 kubenswrapper[28758]: I0223 14:48:05.804220 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83080fd9-e913-441c-b248-8fd33251ced3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.804321 master-0 kubenswrapper[28758]: I0223 14:48:05.804257 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.907629 master-0 kubenswrapper[28758]: I0223 14:48:05.907469 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83080fd9-e913-441c-b248-8fd33251ced3-scripts\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.907629 master-0 kubenswrapper[28758]: I0223 14:48:05.907586 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83080fd9-e913-441c-b248-8fd33251ced3-config\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.907629 master-0 kubenswrapper[28758]: I0223 14:48:05.907626 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrr8d\" (UniqueName: \"kubernetes.io/projected/83080fd9-e913-441c-b248-8fd33251ced3-kube-api-access-jrr8d\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.907629 master-0 kubenswrapper[28758]: I0223 14:48:05.907653 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.908248 master-0 kubenswrapper[28758]: I0223 14:48:05.907671 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83080fd9-e913-441c-b248-8fd33251ced3-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.908248 master-0 kubenswrapper[28758]: I0223 14:48:05.907714 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.908248 master-0 kubenswrapper[28758]: I0223 14:48:05.907780 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.910200 master-0 kubenswrapper[28758]: I0223 14:48:05.910152 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/83080fd9-e913-441c-b248-8fd33251ced3-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.910451 master-0 kubenswrapper[28758]: I0223 14:48:05.910408 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/83080fd9-e913-441c-b248-8fd33251ced3-scripts\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.911879 master-0 kubenswrapper[28758]: I0223 14:48:05.911816 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83080fd9-e913-441c-b248-8fd33251ced3-config\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.913115 master-0 kubenswrapper[28758]: I0223 14:48:05.913084 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.916372 master-0 kubenswrapper[28758]: I0223 14:48:05.916323 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.918396 master-0 kubenswrapper[28758]: I0223 14:48:05.918367 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/83080fd9-e913-441c-b248-8fd33251ced3-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:05.924188 master-0 kubenswrapper[28758]: I0223 14:48:05.924135 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrr8d\" (UniqueName: \"kubernetes.io/projected/83080fd9-e913-441c-b248-8fd33251ced3-kube-api-access-jrr8d\") pod \"ovn-northd-0\" (UID: \"83080fd9-e913-441c-b248-8fd33251ced3\") " pod="openstack/ovn-northd-0" Feb 23 14:48:06.011441 master-0 kubenswrapper[28758]: I0223 14:48:06.011310 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 23 14:48:06.148866 master-0 kubenswrapper[28758]: I0223 14:48:06.143029 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 23 14:48:06.172229 master-0 kubenswrapper[28758]: I0223 14:48:06.172176 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79745f7855-48j7g"] Feb 23 14:48:06.403153 master-0 kubenswrapper[28758]: I0223 14:48:06.403086 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pz8wl" event={"ID":"2ab015df-a84b-45bb-8955-b2210c1564f0","Type":"ContainerStarted","Data":"f2cc7fd11fc8eff81cc3addae18603c1e56fcbe75a37401f0b104f60963f6ee9"} Feb 23 14:48:06.404409 master-0 kubenswrapper[28758]: I0223 14:48:06.404350 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pz8wl" event={"ID":"2ab015df-a84b-45bb-8955-b2210c1564f0","Type":"ContainerStarted","Data":"d6fd1204a2eb5ac0f2c8672353df7f6f2759278b9a59c8692f31efbfdc105bf1"} Feb 23 14:48:06.405232 master-0 kubenswrapper[28758]: I0223 14:48:06.404989 28758 generic.go:334] "Generic (PLEG): container finished" podID="6aac2224-3117-41c5-b022-534475fa21f9" containerID="08b76e92f2dd24704c22ea509fc93caf5391ef6ea4bda1a98bf0abaad71d2a39" exitCode=0 Feb 23 14:48:06.405232 master-0 kubenswrapper[28758]: I0223 14:48:06.405049 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679f75d775-xhpfv" event={"ID":"6aac2224-3117-41c5-b022-534475fa21f9","Type":"ContainerDied","Data":"08b76e92f2dd24704c22ea509fc93caf5391ef6ea4bda1a98bf0abaad71d2a39"} Feb 23 14:48:06.405232 master-0 kubenswrapper[28758]: I0223 14:48:06.405073 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679f75d775-xhpfv" event={"ID":"6aac2224-3117-41c5-b022-534475fa21f9","Type":"ContainerStarted","Data":"dc1f50f00a6e697f77692fe52dc995e98b7ec3b0229b3bb62968fc737c4ec987"} Feb 
23 14:48:06.409685 master-0 kubenswrapper[28758]: I0223 14:48:06.409607 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f7a29d97-0b1c-4657-ae05-8ef48a3813ba","Type":"ContainerStarted","Data":"508d2bd434b3c9783f4d80b5af2cedd784c18a3e416911557f04fff782279880"} Feb 23 14:48:06.412192 master-0 kubenswrapper[28758]: I0223 14:48:06.412061 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79745f7855-48j7g" event={"ID":"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a","Type":"ContainerStarted","Data":"ad7492066209c8ad7d8789d76768034c93bb7949707b18c6395ad3745b7bf3c4"} Feb 23 14:48:06.412311 master-0 kubenswrapper[28758]: I0223 14:48:06.412297 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79745f7855-48j7g" event={"ID":"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a","Type":"ContainerStarted","Data":"e4790f769f844387566c9ab87c820791a82862e266f69c4b7e046d3a15ebb453"} Feb 23 14:48:06.436836 master-0 kubenswrapper[28758]: I0223 14:48:06.436750 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pz8wl" podStartSLOduration=2.436719656 podStartE2EDuration="2.436719656s" podCreationTimestamp="2026-02-23 14:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:06.419017226 +0000 UTC m=+818.545333158" watchObservedRunningTime="2026-02-23 14:48:06.436719656 +0000 UTC m=+818.563035578" Feb 23 14:48:06.530897 master-0 kubenswrapper[28758]: I0223 14:48:06.530819 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.657978601 podStartE2EDuration="32.530789806s" podCreationTimestamp="2026-02-23 14:47:34 +0000 UTC" firstStartedPulling="2026-02-23 14:47:49.957651161 +0000 UTC m=+802.083967093" lastFinishedPulling="2026-02-23 
14:47:57.830462366 +0000 UTC m=+809.956778298" observedRunningTime="2026-02-23 14:48:06.526278016 +0000 UTC m=+818.652593948" watchObservedRunningTime="2026-02-23 14:48:06.530789806 +0000 UTC m=+818.657105738" Feb 23 14:48:06.622544 master-0 kubenswrapper[28758]: I0223 14:48:06.620739 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 23 14:48:06.951278 master-0 kubenswrapper[28758]: I0223 14:48:06.951231 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679f75d775-xhpfv" Feb 23 14:48:07.056224 master-0 kubenswrapper[28758]: I0223 14:48:07.056097 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-config\") pod \"6aac2224-3117-41c5-b022-534475fa21f9\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " Feb 23 14:48:07.056224 master-0 kubenswrapper[28758]: I0223 14:48:07.056184 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-ovsdbserver-sb\") pod \"6aac2224-3117-41c5-b022-534475fa21f9\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " Feb 23 14:48:07.056550 master-0 kubenswrapper[28758]: I0223 14:48:07.056500 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts2qz\" (UniqueName: \"kubernetes.io/projected/6aac2224-3117-41c5-b022-534475fa21f9-kube-api-access-ts2qz\") pod \"6aac2224-3117-41c5-b022-534475fa21f9\" (UID: \"6aac2224-3117-41c5-b022-534475fa21f9\") " Feb 23 14:48:07.056653 master-0 kubenswrapper[28758]: I0223 14:48:07.056629 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-dns-svc\") pod \"6aac2224-3117-41c5-b022-534475fa21f9\" (UID: 
\"6aac2224-3117-41c5-b022-534475fa21f9\") " Feb 23 14:48:07.059214 master-0 kubenswrapper[28758]: I0223 14:48:07.059052 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aac2224-3117-41c5-b022-534475fa21f9-kube-api-access-ts2qz" (OuterVolumeSpecName: "kube-api-access-ts2qz") pod "6aac2224-3117-41c5-b022-534475fa21f9" (UID: "6aac2224-3117-41c5-b022-534475fa21f9"). InnerVolumeSpecName "kube-api-access-ts2qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:48:07.078736 master-0 kubenswrapper[28758]: I0223 14:48:07.078654 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6aac2224-3117-41c5-b022-534475fa21f9" (UID: "6aac2224-3117-41c5-b022-534475fa21f9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:07.084894 master-0 kubenswrapper[28758]: I0223 14:48:07.084835 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-config" (OuterVolumeSpecName: "config") pod "6aac2224-3117-41c5-b022-534475fa21f9" (UID: "6aac2224-3117-41c5-b022-534475fa21f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:07.091095 master-0 kubenswrapper[28758]: I0223 14:48:07.090958 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6aac2224-3117-41c5-b022-534475fa21f9" (UID: "6aac2224-3117-41c5-b022-534475fa21f9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:07.159940 master-0 kubenswrapper[28758]: I0223 14:48:07.159066 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts2qz\" (UniqueName: \"kubernetes.io/projected/6aac2224-3117-41c5-b022-534475fa21f9-kube-api-access-ts2qz\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:07.159940 master-0 kubenswrapper[28758]: I0223 14:48:07.159105 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:07.159940 master-0 kubenswrapper[28758]: I0223 14:48:07.159118 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:07.159940 master-0 kubenswrapper[28758]: I0223 14:48:07.159127 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6aac2224-3117-41c5-b022-534475fa21f9-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:07.421493 master-0 kubenswrapper[28758]: I0223 14:48:07.421423 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"83080fd9-e913-441c-b248-8fd33251ced3","Type":"ContainerStarted","Data":"ce6b6bb74b8581b7620f2dd94be1b90a19c1e4d1fcee39ba8f9cd2caa80135f9"} Feb 23 14:48:07.423822 master-0 kubenswrapper[28758]: I0223 14:48:07.423582 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-679f75d775-xhpfv" event={"ID":"6aac2224-3117-41c5-b022-534475fa21f9","Type":"ContainerDied","Data":"dc1f50f00a6e697f77692fe52dc995e98b7ec3b0229b3bb62968fc737c4ec987"} Feb 23 14:48:07.423822 master-0 kubenswrapper[28758]: I0223 14:48:07.423619 28758 scope.go:117] "RemoveContainer" 
containerID="08b76e92f2dd24704c22ea509fc93caf5391ef6ea4bda1a98bf0abaad71d2a39" Feb 23 14:48:07.424157 master-0 kubenswrapper[28758]: I0223 14:48:07.424035 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-679f75d775-xhpfv" Feb 23 14:48:07.425746 master-0 kubenswrapper[28758]: I0223 14:48:07.425686 28758 generic.go:334] "Generic (PLEG): container finished" podID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerID="ad7492066209c8ad7d8789d76768034c93bb7949707b18c6395ad3745b7bf3c4" exitCode=0 Feb 23 14:48:07.426680 master-0 kubenswrapper[28758]: I0223 14:48:07.426638 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79745f7855-48j7g" event={"ID":"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a","Type":"ContainerDied","Data":"ad7492066209c8ad7d8789d76768034c93bb7949707b18c6395ad3745b7bf3c4"} Feb 23 14:48:07.592661 master-0 kubenswrapper[28758]: I0223 14:48:07.583414 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-679f75d775-xhpfv"] Feb 23 14:48:07.592661 master-0 kubenswrapper[28758]: I0223 14:48:07.590566 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-679f75d775-xhpfv"] Feb 23 14:48:08.111312 master-0 kubenswrapper[28758]: I0223 14:48:08.110270 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aac2224-3117-41c5-b022-534475fa21f9" path="/var/lib/kubelet/pods/6aac2224-3117-41c5-b022-534475fa21f9/volumes" Feb 23 14:48:08.286661 master-0 kubenswrapper[28758]: I0223 14:48:08.270886 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79745f7855-48j7g"] Feb 23 14:48:08.354084 master-0 kubenswrapper[28758]: I0223 14:48:08.350684 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b55dc5f67-8cr6h"] Feb 23 14:48:08.354084 master-0 kubenswrapper[28758]: E0223 14:48:08.351227 28758 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6aac2224-3117-41c5-b022-534475fa21f9" containerName="init" Feb 23 14:48:08.354084 master-0 kubenswrapper[28758]: I0223 14:48:08.351247 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aac2224-3117-41c5-b022-534475fa21f9" containerName="init" Feb 23 14:48:08.354084 master-0 kubenswrapper[28758]: I0223 14:48:08.351794 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aac2224-3117-41c5-b022-534475fa21f9" containerName="init" Feb 23 14:48:08.354084 master-0 kubenswrapper[28758]: I0223 14:48:08.353183 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:48:08.381255 master-0 kubenswrapper[28758]: I0223 14:48:08.378308 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b55dc5f67-8cr6h"] Feb 23 14:48:08.443321 master-0 kubenswrapper[28758]: I0223 14:48:08.443247 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79745f7855-48j7g" event={"ID":"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a","Type":"ContainerStarted","Data":"4ffc85183d69efd71b74e7c1e82ee7f4d0b7cf5295bebb86266de65f45daa24a"} Feb 23 14:48:08.443714 master-0 kubenswrapper[28758]: I0223 14:48:08.443420 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79745f7855-48j7g" Feb 23 14:48:08.446430 master-0 kubenswrapper[28758]: I0223 14:48:08.446397 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"83080fd9-e913-441c-b248-8fd33251ced3","Type":"ContainerStarted","Data":"89c5d19e7e842a7345f2b9521330b38b3e38e975f95dc954018c82dbdf9ce8a5"} Feb 23 14:48:08.494572 master-0 kubenswrapper[28758]: I0223 14:48:08.494410 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-config\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: 
\"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:48:08.494572 master-0 kubenswrapper[28758]: I0223 14:48:08.494529 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-dns-svc\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:48:08.494801 master-0 kubenswrapper[28758]: I0223 14:48:08.494563 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scmqn\" (UniqueName: \"kubernetes.io/projected/162300cf-7eec-4b50-8667-5b6abea5e4d1-kube-api-access-scmqn\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:48:08.495163 master-0 kubenswrapper[28758]: I0223 14:48:08.494922 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:48:08.501115 master-0 kubenswrapper[28758]: I0223 14:48:08.501056 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:48:08.544226 master-0 kubenswrapper[28758]: I0223 14:48:08.543970 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79745f7855-48j7g" 
podStartSLOduration=3.543936323 podStartE2EDuration="3.543936323s" podCreationTimestamp="2026-02-23 14:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:08.519921765 +0000 UTC m=+820.646237707" watchObservedRunningTime="2026-02-23 14:48:08.543936323 +0000 UTC m=+820.670252255"
Feb 23 14:48:08.603850 master-0 kubenswrapper[28758]: I0223 14:48:08.603794 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.604082 master-0 kubenswrapper[28758]: I0223 14:48:08.603857 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.604082 master-0 kubenswrapper[28758]: I0223 14:48:08.603975 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-config\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.604082 master-0 kubenswrapper[28758]: I0223 14:48:08.604006 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-dns-svc\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.604082 master-0 kubenswrapper[28758]: I0223 14:48:08.604036 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scmqn\" (UniqueName: \"kubernetes.io/projected/162300cf-7eec-4b50-8667-5b6abea5e4d1-kube-api-access-scmqn\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.604879 master-0 kubenswrapper[28758]: I0223 14:48:08.604840 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-sb\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.605038 master-0 kubenswrapper[28758]: I0223 14:48:08.605007 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-nb\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.605088 master-0 kubenswrapper[28758]: I0223 14:48:08.605044 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-config\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.605157 master-0 kubenswrapper[28758]: I0223 14:48:08.605122 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-dns-svc\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.625451 master-0 kubenswrapper[28758]: I0223 14:48:08.625396 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scmqn\" (UniqueName: \"kubernetes.io/projected/162300cf-7eec-4b50-8667-5b6abea5e4d1-kube-api-access-scmqn\") pod \"dnsmasq-dns-5b55dc5f67-8cr6h\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:08.683624 master-0 kubenswrapper[28758]: I0223 14:48:08.683537 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:09.302279 master-0 kubenswrapper[28758]: I0223 14:48:09.302224 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b55dc5f67-8cr6h"]
Feb 23 14:48:09.463674 master-0 kubenswrapper[28758]: I0223 14:48:09.463607 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"83080fd9-e913-441c-b248-8fd33251ced3","Type":"ContainerStarted","Data":"095b9e555bc8a9326a4d9af01fc15379981e13e046ac4f7c471587410ee3a929"}
Feb 23 14:48:09.466850 master-0 kubenswrapper[28758]: I0223 14:48:09.464678 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 23 14:48:09.468434 master-0 kubenswrapper[28758]: I0223 14:48:09.468385 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" event={"ID":"162300cf-7eec-4b50-8667-5b6abea5e4d1","Type":"ContainerStarted","Data":"2757e8f14dd508d75782381be5749d4f93480f9312cb88d5b478319ab1997851"}
Feb 23 14:48:09.468652 master-0 kubenswrapper[28758]: I0223 14:48:09.468577 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79745f7855-48j7g" podUID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerName="dnsmasq-dns" containerID="cri-o://4ffc85183d69efd71b74e7c1e82ee7f4d0b7cf5295bebb86266de65f45daa24a" gracePeriod=10
Feb 23 14:48:09.762567 master-0 kubenswrapper[28758]: I0223 14:48:09.762451 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.414905506 podStartE2EDuration="4.762429458s" podCreationTimestamp="2026-02-23 14:48:05 +0000 UTC" firstStartedPulling="2026-02-23 14:48:06.585484709 +0000 UTC m=+818.711800641" lastFinishedPulling="2026-02-23 14:48:07.933008661 +0000 UTC m=+820.059324593" observedRunningTime="2026-02-23 14:48:09.75722787 +0000 UTC m=+821.883543802" watchObservedRunningTime="2026-02-23 14:48:09.762429458 +0000 UTC m=+821.888745390"
Feb 23 14:48:10.024824 master-0 kubenswrapper[28758]: I0223 14:48:10.024753 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 23 14:48:10.024824 master-0 kubenswrapper[28758]: I0223 14:48:10.024831 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 23 14:48:10.099756 master-0 kubenswrapper[28758]: I0223 14:48:10.099675 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 23 14:48:10.479880 master-0 kubenswrapper[28758]: I0223 14:48:10.479787 28758 generic.go:334] "Generic (PLEG): container finished" podID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerID="e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25" exitCode=0
Feb 23 14:48:10.479880 master-0 kubenswrapper[28758]: I0223 14:48:10.479869 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" event={"ID":"162300cf-7eec-4b50-8667-5b6abea5e4d1","Type":"ContainerDied","Data":"e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25"}
Feb 23 14:48:10.489854 master-0 kubenswrapper[28758]: I0223 14:48:10.489730 28758 generic.go:334] "Generic (PLEG): container finished" podID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerID="4ffc85183d69efd71b74e7c1e82ee7f4d0b7cf5295bebb86266de65f45daa24a" exitCode=0
Feb 23 14:48:10.489854 master-0 kubenswrapper[28758]: I0223 14:48:10.489781 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79745f7855-48j7g" event={"ID":"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a","Type":"ContainerDied","Data":"4ffc85183d69efd71b74e7c1e82ee7f4d0b7cf5295bebb86266de65f45daa24a"}
Feb 23 14:48:10.579160 master-0 kubenswrapper[28758]: I0223 14:48:10.578994 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 23 14:48:10.587226 master-0 kubenswrapper[28758]: I0223 14:48:10.587163 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 23 14:48:10.594130 master-0 kubenswrapper[28758]: I0223 14:48:10.593895 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 23 14:48:10.594130 master-0 kubenswrapper[28758]: I0223 14:48:10.593961 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 23 14:48:10.594377 master-0 kubenswrapper[28758]: I0223 14:48:10.594214 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 23 14:48:10.600148 master-0 kubenswrapper[28758]: I0223 14:48:10.600093 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Feb 23 14:48:10.752278 master-0 kubenswrapper[28758]: I0223 14:48:10.752222 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:10.855331 master-0 kubenswrapper[28758]: I0223 14:48:10.855196 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-sb\") pod \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") "
Feb 23 14:48:10.855606 master-0 kubenswrapper[28758]: I0223 14:48:10.855584 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5g5j\" (UniqueName: \"kubernetes.io/projected/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-kube-api-access-t5g5j\") pod \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") "
Feb 23 14:48:10.855874 master-0 kubenswrapper[28758]: I0223 14:48:10.855855 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-nb\") pod \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") "
Feb 23 14:48:10.856059 master-0 kubenswrapper[28758]: I0223 14:48:10.856039 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-config\") pod \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") "
Feb 23 14:48:10.856290 master-0 kubenswrapper[28758]: I0223 14:48:10.856271 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-dns-svc\") pod \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\" (UID: \"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a\") "
Feb 23 14:48:10.865620 master-0 kubenswrapper[28758]: I0223 14:48:10.865476 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 23 14:48:10.872602 master-0 kubenswrapper[28758]: I0223 14:48:10.872527 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-kube-api-access-t5g5j" (OuterVolumeSpecName: "kube-api-access-t5g5j") pod "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" (UID: "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a"). InnerVolumeSpecName "kube-api-access-t5g5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:10.911290 master-0 kubenswrapper[28758]: I0223 14:48:10.911242 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" (UID: "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:10.932882 master-0 kubenswrapper[28758]: I0223 14:48:10.932814 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" (UID: "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:10.934135 master-0 kubenswrapper[28758]: I0223 14:48:10.934105 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" (UID: "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:10.935661 master-0 kubenswrapper[28758]: I0223 14:48:10.935631 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-config" (OuterVolumeSpecName: "config") pod "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" (UID: "cf60feaf-b30d-4b21-8cd8-f75b3c311d2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:10.967958 master-0 kubenswrapper[28758]: I0223 14:48:10.967831 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:10.967958 master-0 kubenswrapper[28758]: I0223 14:48:10.967892 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:10.967958 master-0 kubenswrapper[28758]: I0223 14:48:10.967902 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-dns-svc\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:10.967958 master-0 kubenswrapper[28758]: I0223 14:48:10.967910 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:10.967958 master-0 kubenswrapper[28758]: I0223 14:48:10.967923 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5g5j\" (UniqueName: \"kubernetes.io/projected/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a-kube-api-access-t5g5j\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:11.172255 master-0 kubenswrapper[28758]: I0223 14:48:11.172184 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-925c6af3-41f7-409a-968b-0ed621490764\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a275db2b-1d19-4445-ad46-700721698a0c\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.172255 master-0 kubenswrapper[28758]: I0223 14:48:11.172264 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b978484-31f3-46af-aedd-e96a997da517-lock\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.172935 master-0 kubenswrapper[28758]: I0223 14:48:11.172290 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.172935 master-0 kubenswrapper[28758]: I0223 14:48:11.172344 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b978484-31f3-46af-aedd-e96a997da517-cache\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.172935 master-0 kubenswrapper[28758]: I0223 14:48:11.172383 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b978484-31f3-46af-aedd-e96a997da517-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.172935 master-0 kubenswrapper[28758]: I0223 14:48:11.172419 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bnk5\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-kube-api-access-9bnk5\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.265436 master-0 kubenswrapper[28758]: I0223 14:48:11.265336 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 23 14:48:11.265436 master-0 kubenswrapper[28758]: I0223 14:48:11.265413 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 23 14:48:11.274898 master-0 kubenswrapper[28758]: I0223 14:48:11.274849 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b978484-31f3-46af-aedd-e96a997da517-lock\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.275001 master-0 kubenswrapper[28758]: I0223 14:48:11.274921 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.275117 master-0 kubenswrapper[28758]: I0223 14:48:11.275080 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b978484-31f3-46af-aedd-e96a997da517-cache\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.275170 master-0 kubenswrapper[28758]: I0223 14:48:11.275131 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b978484-31f3-46af-aedd-e96a997da517-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.275219 master-0 kubenswrapper[28758]: I0223 14:48:11.275180 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bnk5\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-kube-api-access-9bnk5\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.277453 master-0 kubenswrapper[28758]: E0223 14:48:11.276449 28758 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 14:48:11.277453 master-0 kubenswrapper[28758]: E0223 14:48:11.276483 28758 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 14:48:11.277453 master-0 kubenswrapper[28758]: E0223 14:48:11.276536 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift podName:3b978484-31f3-46af-aedd-e96a997da517 nodeName:}" failed. No retries permitted until 2026-02-23 14:48:11.776521295 +0000 UTC m=+823.902837227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift") pod "swift-storage-0" (UID: "3b978484-31f3-46af-aedd-e96a997da517") : configmap "swift-ring-files" not found
Feb 23 14:48:11.278865 master-0 kubenswrapper[28758]: I0223 14:48:11.278818 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3b978484-31f3-46af-aedd-e96a997da517-lock\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.280186 master-0 kubenswrapper[28758]: I0223 14:48:11.280151 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b978484-31f3-46af-aedd-e96a997da517-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.281595 master-0 kubenswrapper[28758]: I0223 14:48:11.281479 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3b978484-31f3-46af-aedd-e96a997da517-cache\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.380637 master-0 kubenswrapper[28758]: I0223 14:48:11.377207 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-925c6af3-41f7-409a-968b-0ed621490764\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a275db2b-1d19-4445-ad46-700721698a0c\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.380637 master-0 kubenswrapper[28758]: I0223 14:48:11.378817 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 23 14:48:11.380637 master-0 kubenswrapper[28758]: I0223 14:48:11.378842 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-925c6af3-41f7-409a-968b-0ed621490764\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a275db2b-1d19-4445-ad46-700721698a0c\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/70bee7cc419191282a73c951eb69531526421d5dde0713ecf1efd67083574c1e/globalmount\"" pod="openstack/swift-storage-0"
Feb 23 14:48:11.510388 master-0 kubenswrapper[28758]: I0223 14:48:11.510329 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" event={"ID":"162300cf-7eec-4b50-8667-5b6abea5e4d1","Type":"ContainerStarted","Data":"a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b"}
Feb 23 14:48:11.510846 master-0 kubenswrapper[28758]: I0223 14:48:11.510549 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:48:11.513625 master-0 kubenswrapper[28758]: I0223 14:48:11.513591 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79745f7855-48j7g"
Feb 23 14:48:11.514304 master-0 kubenswrapper[28758]: I0223 14:48:11.514271 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79745f7855-48j7g" event={"ID":"cf60feaf-b30d-4b21-8cd8-f75b3c311d2a","Type":"ContainerDied","Data":"e4790f769f844387566c9ab87c820791a82862e266f69c4b7e046d3a15ebb453"}
Feb 23 14:48:11.514350 master-0 kubenswrapper[28758]: I0223 14:48:11.514315 28758 scope.go:117] "RemoveContainer" containerID="4ffc85183d69efd71b74e7c1e82ee7f4d0b7cf5295bebb86266de65f45daa24a"
Feb 23 14:48:11.568029 master-0 kubenswrapper[28758]: I0223 14:48:11.567969 28758 scope.go:117] "RemoveContainer" containerID="ad7492066209c8ad7d8789d76768034c93bb7949707b18c6395ad3745b7bf3c4"
Feb 23 14:48:11.642645 master-0 kubenswrapper[28758]: I0223 14:48:11.642399 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bnk5\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-kube-api-access-9bnk5\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.845725 master-0 kubenswrapper[28758]: I0223 14:48:11.828507 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0"
Feb 23 14:48:11.845725 master-0 kubenswrapper[28758]: E0223 14:48:11.829123 28758 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 23 14:48:11.845725 master-0 kubenswrapper[28758]: E0223 14:48:11.838563 28758 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 23 14:48:11.845725 master-0 kubenswrapper[28758]: E0223 14:48:11.838643 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift podName:3b978484-31f3-46af-aedd-e96a997da517 nodeName:}" failed. No retries permitted until 2026-02-23 14:48:12.8386189 +0000 UTC m=+824.964934842 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift") pod "swift-storage-0" (UID: "3b978484-31f3-46af-aedd-e96a997da517") : configmap "swift-ring-files" not found
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.890250 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-72x5r"]
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: E0223 14:48:11.891206 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerName="dnsmasq-dns"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.891258 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerName="dnsmasq-dns"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: E0223 14:48:11.891273 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerName="init"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.891279 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerName="init"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.891587 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" containerName="dnsmasq-dns"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.892341 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.896128 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.896425 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 23 14:48:11.906537 master-0 kubenswrapper[28758]: I0223 14:48:11.896607 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 23 14:48:11.910751 master-0 kubenswrapper[28758]: I0223 14:48:11.907681 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79745f7855-48j7g"]
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.938332 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79745f7855-48j7g"]
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.939926 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-dispersionconf\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.940046 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-swiftconf\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.940071 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-ring-data-devices\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.940094 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-combined-ca-bundle\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.940115 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6kzx\" (UniqueName: \"kubernetes.io/projected/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-kube-api-access-d6kzx\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.940148 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-etc-swift\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.948573 master-0 kubenswrapper[28758]: I0223 14:48:11.940176 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-scripts\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:11.957034 master-0 kubenswrapper[28758]: I0223 14:48:11.956959 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-72x5r"]
Feb 23 14:48:11.965590 master-0 kubenswrapper[28758]: I0223 14:48:11.965505 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" podStartSLOduration=3.96546515 podStartE2EDuration="3.96546515s" podCreationTimestamp="2026-02-23 14:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:11.874199845 +0000 UTC m=+824.000515777" watchObservedRunningTime="2026-02-23 14:48:11.96546515 +0000 UTC m=+824.091781082"
Feb 23 14:48:11.990224 master-0 kubenswrapper[28758]: I0223 14:48:11.990157 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-72x5r"]
Feb 23 14:48:11.993054 master-0 kubenswrapper[28758]: E0223 14:48:11.991829 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-d6kzx ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-72x5r" podUID="8a9bef43-36af-4765-8ace-cfc1fdb3c80a"
Feb 23 14:48:12.002017 master-0 kubenswrapper[28758]: I0223 14:48:12.001955 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m897g"]
Feb 23 14:48:12.003965 master-0 kubenswrapper[28758]: I0223 14:48:12.003912 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.016569 master-0 kubenswrapper[28758]: I0223 14:48:12.015691 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m897g"]
Feb 23 14:48:12.041702 master-0 kubenswrapper[28758]: I0223 14:48:12.041637 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-dispersionconf\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.041944 master-0 kubenswrapper[28758]: I0223 14:48:12.041896 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-scripts\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.042005 master-0 kubenswrapper[28758]: I0223 14:48:12.041978 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-combined-ca-bundle\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.042074 master-0 kubenswrapper[28758]: I0223 14:48:12.042017 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-dispersionconf\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.042074 master-0 kubenswrapper[28758]: I0223 14:48:12.042035 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/518b33ee-87c2-4090-a971-735d98c01d7f-etc-swift\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.042074 master-0 kubenswrapper[28758]: I0223 14:48:12.042060 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-swiftconf\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.042224 master-0 kubenswrapper[28758]: I0223 14:48:12.042085 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-ring-data-devices\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.042224 master-0 kubenswrapper[28758]: I0223 14:48:12.042111 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwd8n\" (UniqueName: \"kubernetes.io/projected/518b33ee-87c2-4090-a971-735d98c01d7f-kube-api-access-gwd8n\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.042224 master-0 kubenswrapper[28758]: I0223 14:48:12.042137 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-ring-data-devices\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.042224 master-0 kubenswrapper[28758]: I0223 14:48:12.042164 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-combined-ca-bundle\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.042224 master-0 kubenswrapper[28758]: I0223 14:48:12.042192 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kzx\" (UniqueName: \"kubernetes.io/projected/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-kube-api-access-d6kzx\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.042224 master-0 kubenswrapper[28758]: I0223 14:48:12.042215 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-swiftconf\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g"
Feb 23 14:48:12.042514 master-0 kubenswrapper[28758]: I0223 14:48:12.042241 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-etc-swift\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.042514 master-0 kubenswrapper[28758]: I0223 14:48:12.042273 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-scripts\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.042888 master-0 kubenswrapper[28758]: I0223 14:48:12.042815 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-etc-swift\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.043588 master-0 kubenswrapper[28758]: I0223 14:48:12.043548 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-ring-data-devices\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.044638 master-0 kubenswrapper[28758]: I0223 14:48:12.044545 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-scripts\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.045093 master-0 kubenswrapper[28758]: I0223 14:48:12.044944 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-dispersionconf\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.046695 master-0 kubenswrapper[28758]: I0223 14:48:12.046670 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-swiftconf\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r"
Feb 23 14:48:12.048852 master-0 kubenswrapper[28758]: I0223 14:48:12.048336 28758 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-combined-ca-bundle\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r" Feb 23 14:48:12.058635 master-0 kubenswrapper[28758]: I0223 14:48:12.058582 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kzx\" (UniqueName: \"kubernetes.io/projected/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-kube-api-access-d6kzx\") pod \"swift-ring-rebalance-72x5r\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " pod="openstack/swift-ring-rebalance-72x5r" Feb 23 14:48:12.099378 master-0 kubenswrapper[28758]: I0223 14:48:12.099326 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf60feaf-b30d-4b21-8cd8-f75b3c311d2a" path="/var/lib/kubelet/pods/cf60feaf-b30d-4b21-8cd8-f75b3c311d2a/volumes" Feb 23 14:48:12.144581 master-0 kubenswrapper[28758]: I0223 14:48:12.144486 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-scripts\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.144806 master-0 kubenswrapper[28758]: I0223 14:48:12.144713 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-combined-ca-bundle\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.144806 master-0 kubenswrapper[28758]: I0223 14:48:12.144745 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-dispersionconf\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.146383 master-0 kubenswrapper[28758]: I0223 14:48:12.145171 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/518b33ee-87c2-4090-a971-735d98c01d7f-etc-swift\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.146383 master-0 kubenswrapper[28758]: I0223 14:48:12.145310 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-ring-data-devices\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.146383 master-0 kubenswrapper[28758]: I0223 14:48:12.145440 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwd8n\" (UniqueName: \"kubernetes.io/projected/518b33ee-87c2-4090-a971-735d98c01d7f-kube-api-access-gwd8n\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.146383 master-0 kubenswrapper[28758]: I0223 14:48:12.145610 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-swiftconf\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.146383 master-0 kubenswrapper[28758]: I0223 14:48:12.145761 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-scripts\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.146383 master-0 kubenswrapper[28758]: I0223 14:48:12.145971 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/518b33ee-87c2-4090-a971-735d98c01d7f-etc-swift\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.146383 master-0 kubenswrapper[28758]: I0223 14:48:12.146326 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-ring-data-devices\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.148737 master-0 kubenswrapper[28758]: I0223 14:48:12.148324 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-combined-ca-bundle\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.149194 master-0 kubenswrapper[28758]: I0223 14:48:12.148920 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-dispersionconf\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.149194 master-0 kubenswrapper[28758]: I0223 14:48:12.149154 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-swiftconf\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.162086 master-0 kubenswrapper[28758]: I0223 14:48:12.161947 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwd8n\" (UniqueName: \"kubernetes.io/projected/518b33ee-87c2-4090-a971-735d98c01d7f-kube-api-access-gwd8n\") pod \"swift-ring-rebalance-m897g\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.330263 master-0 kubenswrapper[28758]: I0223 14:48:12.330199 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:12.496452 master-0 kubenswrapper[28758]: I0223 14:48:12.496398 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 23 14:48:12.539244 master-0 kubenswrapper[28758]: I0223 14:48:12.538425 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-72x5r" Feb 23 14:48:12.551750 master-0 kubenswrapper[28758]: I0223 14:48:12.551695 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-72x5r" Feb 23 14:48:12.576794 master-0 kubenswrapper[28758]: I0223 14:48:12.576746 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 23 14:48:12.657683 master-0 kubenswrapper[28758]: I0223 14:48:12.657068 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6kzx\" (UniqueName: \"kubernetes.io/projected/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-kube-api-access-d6kzx\") pod \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " Feb 23 14:48:12.658173 master-0 kubenswrapper[28758]: I0223 14:48:12.658152 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-scripts\") pod \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " Feb 23 14:48:12.661804 master-0 kubenswrapper[28758]: I0223 14:48:12.661762 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-combined-ca-bundle\") pod \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " Feb 23 14:48:12.662177 master-0 kubenswrapper[28758]: I0223 14:48:12.662137 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-etc-swift\") pod \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " Feb 23 14:48:12.662596 master-0 kubenswrapper[28758]: I0223 14:48:12.662576 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-swiftconf\") pod 
\"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " Feb 23 14:48:12.662749 master-0 kubenswrapper[28758]: I0223 14:48:12.662731 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-ring-data-devices\") pod \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " Feb 23 14:48:12.662917 master-0 kubenswrapper[28758]: I0223 14:48:12.662879 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-dispersionconf\") pod \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\" (UID: \"8a9bef43-36af-4765-8ace-cfc1fdb3c80a\") " Feb 23 14:48:12.664575 master-0 kubenswrapper[28758]: I0223 14:48:12.660656 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-scripts" (OuterVolumeSpecName: "scripts") pod "8a9bef43-36af-4765-8ace-cfc1fdb3c80a" (UID: "8a9bef43-36af-4765-8ace-cfc1fdb3c80a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:12.664575 master-0 kubenswrapper[28758]: I0223 14:48:12.661612 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-kube-api-access-d6kzx" (OuterVolumeSpecName: "kube-api-access-d6kzx") pod "8a9bef43-36af-4765-8ace-cfc1fdb3c80a" (UID: "8a9bef43-36af-4765-8ace-cfc1fdb3c80a"). InnerVolumeSpecName "kube-api-access-d6kzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:48:12.665581 master-0 kubenswrapper[28758]: I0223 14:48:12.665527 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8a9bef43-36af-4765-8ace-cfc1fdb3c80a" (UID: "8a9bef43-36af-4765-8ace-cfc1fdb3c80a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:48:12.665581 master-0 kubenswrapper[28758]: I0223 14:48:12.665569 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8a9bef43-36af-4765-8ace-cfc1fdb3c80a" (UID: "8a9bef43-36af-4765-8ace-cfc1fdb3c80a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:12.667199 master-0 kubenswrapper[28758]: I0223 14:48:12.667109 28758 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:12.667199 master-0 kubenswrapper[28758]: I0223 14:48:12.667138 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6kzx\" (UniqueName: \"kubernetes.io/projected/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-kube-api-access-d6kzx\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:12.667199 master-0 kubenswrapper[28758]: I0223 14:48:12.667149 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:12.667199 master-0 kubenswrapper[28758]: I0223 14:48:12.667158 28758 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-etc-swift\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:12.667565 master-0 kubenswrapper[28758]: I0223 14:48:12.667474 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8a9bef43-36af-4765-8ace-cfc1fdb3c80a" (UID: "8a9bef43-36af-4765-8ace-cfc1fdb3c80a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:12.669612 master-0 kubenswrapper[28758]: I0223 14:48:12.669556 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8a9bef43-36af-4765-8ace-cfc1fdb3c80a" (UID: "8a9bef43-36af-4765-8ace-cfc1fdb3c80a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:12.670242 master-0 kubenswrapper[28758]: I0223 14:48:12.670197 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a9bef43-36af-4765-8ace-cfc1fdb3c80a" (UID: "8a9bef43-36af-4765-8ace-cfc1fdb3c80a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:12.770767 master-0 kubenswrapper[28758]: I0223 14:48:12.770693 28758 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-swiftconf\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:12.770767 master-0 kubenswrapper[28758]: I0223 14:48:12.770761 28758 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-dispersionconf\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:12.771026 master-0 kubenswrapper[28758]: I0223 14:48:12.770778 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9bef43-36af-4765-8ace-cfc1fdb3c80a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:12.773451 master-0 kubenswrapper[28758]: I0223 14:48:12.773417 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m897g"] Feb 23 14:48:12.845152 master-0 kubenswrapper[28758]: I0223 14:48:12.845091 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-925c6af3-41f7-409a-968b-0ed621490764\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a275db2b-1d19-4445-ad46-700721698a0c\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0" Feb 23 14:48:12.872900 master-0 kubenswrapper[28758]: I0223 14:48:12.872849 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0" Feb 23 14:48:12.873157 master-0 kubenswrapper[28758]: E0223 14:48:12.873129 28758 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap 
"swift-ring-files" not found Feb 23 14:48:12.873157 master-0 kubenswrapper[28758]: E0223 14:48:12.873157 28758 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 14:48:12.873264 master-0 kubenswrapper[28758]: E0223 14:48:12.873210 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift podName:3b978484-31f3-46af-aedd-e96a997da517 nodeName:}" failed. No retries permitted until 2026-02-23 14:48:14.873192858 +0000 UTC m=+826.999508790 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift") pod "swift-storage-0" (UID: "3b978484-31f3-46af-aedd-e96a997da517") : configmap "swift-ring-files" not found Feb 23 14:48:12.964106 master-0 kubenswrapper[28758]: I0223 14:48:12.964043 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-x2jgp"] Feb 23 14:48:12.968552 master-0 kubenswrapper[28758]: I0223 14:48:12.968469 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:12.971093 master-0 kubenswrapper[28758]: I0223 14:48:12.970951 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 23 14:48:12.975357 master-0 kubenswrapper[28758]: I0223 14:48:12.975326 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x2jgp"] Feb 23 14:48:13.076299 master-0 kubenswrapper[28758]: I0223 14:48:13.076020 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614e1111-ec92-499f-8dff-a31274ee9922-operator-scripts\") pod \"root-account-create-update-x2jgp\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:13.076299 master-0 kubenswrapper[28758]: I0223 14:48:13.076190 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4mv\" (UniqueName: \"kubernetes.io/projected/614e1111-ec92-499f-8dff-a31274ee9922-kube-api-access-tf4mv\") pod \"root-account-create-update-x2jgp\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:13.178280 master-0 kubenswrapper[28758]: I0223 14:48:13.178212 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4mv\" (UniqueName: \"kubernetes.io/projected/614e1111-ec92-499f-8dff-a31274ee9922-kube-api-access-tf4mv\") pod \"root-account-create-update-x2jgp\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:13.179072 master-0 kubenswrapper[28758]: I0223 14:48:13.178667 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/614e1111-ec92-499f-8dff-a31274ee9922-operator-scripts\") pod \"root-account-create-update-x2jgp\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:13.181660 master-0 kubenswrapper[28758]: I0223 14:48:13.179354 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614e1111-ec92-499f-8dff-a31274ee9922-operator-scripts\") pod \"root-account-create-update-x2jgp\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:13.204141 master-0 kubenswrapper[28758]: I0223 14:48:13.203735 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4mv\" (UniqueName: \"kubernetes.io/projected/614e1111-ec92-499f-8dff-a31274ee9922-kube-api-access-tf4mv\") pod \"root-account-create-update-x2jgp\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:13.335059 master-0 kubenswrapper[28758]: I0223 14:48:13.334300 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:13.553818 master-0 kubenswrapper[28758]: I0223 14:48:13.553625 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m897g" event={"ID":"518b33ee-87c2-4090-a971-735d98c01d7f","Type":"ContainerStarted","Data":"8cc31f394eb8a028a9987aef34e7fb4796a685e7fa9c552a2a03856c2af485b3"} Feb 23 14:48:13.553818 master-0 kubenswrapper[28758]: I0223 14:48:13.553627 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-72x5r" Feb 23 14:48:13.629619 master-0 kubenswrapper[28758]: I0223 14:48:13.629423 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-72x5r"] Feb 23 14:48:13.649099 master-0 kubenswrapper[28758]: I0223 14:48:13.649032 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-72x5r"] Feb 23 14:48:13.831351 master-0 kubenswrapper[28758]: I0223 14:48:13.831211 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-x2jgp"] Feb 23 14:48:13.834087 master-0 kubenswrapper[28758]: W0223 14:48:13.833960 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod614e1111_ec92_499f_8dff_a31274ee9922.slice/crio-9fbf61418a47dd78ff30e6ac7b84754050eb891b77136080e11f5fea5a578014 WatchSource:0}: Error finding container 9fbf61418a47dd78ff30e6ac7b84754050eb891b77136080e11f5fea5a578014: Status 404 returned error can't find the container with id 9fbf61418a47dd78ff30e6ac7b84754050eb891b77136080e11f5fea5a578014 Feb 23 14:48:14.114757 master-0 kubenswrapper[28758]: I0223 14:48:14.114683 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9bef43-36af-4765-8ace-cfc1fdb3c80a" path="/var/lib/kubelet/pods/8a9bef43-36af-4765-8ace-cfc1fdb3c80a/volumes" Feb 23 14:48:14.570647 master-0 kubenswrapper[28758]: I0223 14:48:14.569865 28758 generic.go:334] "Generic (PLEG): container finished" podID="614e1111-ec92-499f-8dff-a31274ee9922" containerID="ea8438f93e84603e9f263f66805cae02d647db14a713ffc4dbbff8d1283f37ee" exitCode=0 Feb 23 14:48:14.570647 master-0 kubenswrapper[28758]: I0223 14:48:14.569930 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x2jgp" 
event={"ID":"614e1111-ec92-499f-8dff-a31274ee9922","Type":"ContainerDied","Data":"ea8438f93e84603e9f263f66805cae02d647db14a713ffc4dbbff8d1283f37ee"} Feb 23 14:48:14.570647 master-0 kubenswrapper[28758]: I0223 14:48:14.569968 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x2jgp" event={"ID":"614e1111-ec92-499f-8dff-a31274ee9922","Type":"ContainerStarted","Data":"9fbf61418a47dd78ff30e6ac7b84754050eb891b77136080e11f5fea5a578014"} Feb 23 14:48:14.933737 master-0 kubenswrapper[28758]: I0223 14:48:14.933658 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0" Feb 23 14:48:14.933949 master-0 kubenswrapper[28758]: E0223 14:48:14.933869 28758 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 14:48:14.933949 master-0 kubenswrapper[28758]: E0223 14:48:14.933897 28758 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 14:48:14.934048 master-0 kubenswrapper[28758]: E0223 14:48:14.933965 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift podName:3b978484-31f3-46af-aedd-e96a997da517 nodeName:}" failed. No retries permitted until 2026-02-23 14:48:18.93393926 +0000 UTC m=+831.060255202 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift") pod "swift-storage-0" (UID: "3b978484-31f3-46af-aedd-e96a997da517") : configmap "swift-ring-files" not found Feb 23 14:48:16.082141 master-0 kubenswrapper[28758]: I0223 14:48:16.082094 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:16.177408 master-0 kubenswrapper[28758]: I0223 14:48:16.177315 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tf4mv\" (UniqueName: \"kubernetes.io/projected/614e1111-ec92-499f-8dff-a31274ee9922-kube-api-access-tf4mv\") pod \"614e1111-ec92-499f-8dff-a31274ee9922\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " Feb 23 14:48:16.177408 master-0 kubenswrapper[28758]: I0223 14:48:16.177406 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614e1111-ec92-499f-8dff-a31274ee9922-operator-scripts\") pod \"614e1111-ec92-499f-8dff-a31274ee9922\" (UID: \"614e1111-ec92-499f-8dff-a31274ee9922\") " Feb 23 14:48:16.186679 master-0 kubenswrapper[28758]: I0223 14:48:16.185909 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/614e1111-ec92-499f-8dff-a31274ee9922-kube-api-access-tf4mv" (OuterVolumeSpecName: "kube-api-access-tf4mv") pod "614e1111-ec92-499f-8dff-a31274ee9922" (UID: "614e1111-ec92-499f-8dff-a31274ee9922"). InnerVolumeSpecName "kube-api-access-tf4mv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:48:16.187267 master-0 kubenswrapper[28758]: I0223 14:48:16.187193 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/614e1111-ec92-499f-8dff-a31274ee9922-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "614e1111-ec92-499f-8dff-a31274ee9922" (UID: "614e1111-ec92-499f-8dff-a31274ee9922"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:16.281613 master-0 kubenswrapper[28758]: I0223 14:48:16.281555 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tf4mv\" (UniqueName: \"kubernetes.io/projected/614e1111-ec92-499f-8dff-a31274ee9922-kube-api-access-tf4mv\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:16.281613 master-0 kubenswrapper[28758]: I0223 14:48:16.281594 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/614e1111-ec92-499f-8dff-a31274ee9922-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:16.326517 master-0 kubenswrapper[28758]: I0223 14:48:16.326419 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-sl8ng"] Feb 23 14:48:16.327289 master-0 kubenswrapper[28758]: E0223 14:48:16.327258 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614e1111-ec92-499f-8dff-a31274ee9922" containerName="mariadb-account-create-update" Feb 23 14:48:16.327289 master-0 kubenswrapper[28758]: I0223 14:48:16.327284 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="614e1111-ec92-499f-8dff-a31274ee9922" containerName="mariadb-account-create-update" Feb 23 14:48:16.327666 master-0 kubenswrapper[28758]: I0223 14:48:16.327625 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="614e1111-ec92-499f-8dff-a31274ee9922" containerName="mariadb-account-create-update" Feb 23 14:48:16.328410 master-0 
kubenswrapper[28758]: I0223 14:48:16.328373 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.339623 master-0 kubenswrapper[28758]: I0223 14:48:16.339523 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sl8ng"] Feb 23 14:48:16.399275 master-0 kubenswrapper[28758]: I0223 14:48:16.399129 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bb37fbb-098a-4f0c-894a-918e22d2c343-operator-scripts\") pod \"glance-db-create-sl8ng\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") " pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.399275 master-0 kubenswrapper[28758]: I0223 14:48:16.399220 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvqf\" (UniqueName: \"kubernetes.io/projected/8bb37fbb-098a-4f0c-894a-918e22d2c343-kube-api-access-clvqf\") pod \"glance-db-create-sl8ng\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") " pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.458165 master-0 kubenswrapper[28758]: I0223 14:48:16.458097 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b4e2-account-create-update-r6sdz"] Feb 23 14:48:16.459385 master-0 kubenswrapper[28758]: I0223 14:48:16.459360 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:16.462841 master-0 kubenswrapper[28758]: I0223 14:48:16.461968 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 23 14:48:16.476502 master-0 kubenswrapper[28758]: I0223 14:48:16.476432 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b4e2-account-create-update-r6sdz"] Feb 23 14:48:16.501327 master-0 kubenswrapper[28758]: I0223 14:48:16.501249 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031f392b-0b1d-4f88-a32b-d015b41857f6-operator-scripts\") pod \"glance-b4e2-account-create-update-r6sdz\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") " pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:16.501595 master-0 kubenswrapper[28758]: I0223 14:48:16.501347 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wm9q\" (UniqueName: \"kubernetes.io/projected/031f392b-0b1d-4f88-a32b-d015b41857f6-kube-api-access-7wm9q\") pod \"glance-b4e2-account-create-update-r6sdz\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") " pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:16.501595 master-0 kubenswrapper[28758]: I0223 14:48:16.501427 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bb37fbb-098a-4f0c-894a-918e22d2c343-operator-scripts\") pod \"glance-db-create-sl8ng\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") " pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.501595 master-0 kubenswrapper[28758]: I0223 14:48:16.501465 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvqf\" (UniqueName: 
\"kubernetes.io/projected/8bb37fbb-098a-4f0c-894a-918e22d2c343-kube-api-access-clvqf\") pod \"glance-db-create-sl8ng\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") " pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.502177 master-0 kubenswrapper[28758]: I0223 14:48:16.502145 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bb37fbb-098a-4f0c-894a-918e22d2c343-operator-scripts\") pod \"glance-db-create-sl8ng\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") " pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.517626 master-0 kubenswrapper[28758]: I0223 14:48:16.517549 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvqf\" (UniqueName: \"kubernetes.io/projected/8bb37fbb-098a-4f0c-894a-918e22d2c343-kube-api-access-clvqf\") pod \"glance-db-create-sl8ng\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") " pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.599651 master-0 kubenswrapper[28758]: I0223 14:48:16.598562 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-x2jgp" event={"ID":"614e1111-ec92-499f-8dff-a31274ee9922","Type":"ContainerDied","Data":"9fbf61418a47dd78ff30e6ac7b84754050eb891b77136080e11f5fea5a578014"} Feb 23 14:48:16.599651 master-0 kubenswrapper[28758]: I0223 14:48:16.598627 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fbf61418a47dd78ff30e6ac7b84754050eb891b77136080e11f5fea5a578014" Feb 23 14:48:16.599651 master-0 kubenswrapper[28758]: I0223 14:48:16.598591 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-x2jgp" Feb 23 14:48:16.602938 master-0 kubenswrapper[28758]: I0223 14:48:16.600076 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m897g" event={"ID":"518b33ee-87c2-4090-a971-735d98c01d7f","Type":"ContainerStarted","Data":"8ec02d365dd2b0d29e1bc74ea3698781d4530e9c0e2f5e7d1e4f99d28c6c8f1f"} Feb 23 14:48:16.607712 master-0 kubenswrapper[28758]: I0223 14:48:16.607250 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031f392b-0b1d-4f88-a32b-d015b41857f6-operator-scripts\") pod \"glance-b4e2-account-create-update-r6sdz\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") " pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:16.607712 master-0 kubenswrapper[28758]: I0223 14:48:16.607421 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wm9q\" (UniqueName: \"kubernetes.io/projected/031f392b-0b1d-4f88-a32b-d015b41857f6-kube-api-access-7wm9q\") pod \"glance-b4e2-account-create-update-r6sdz\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") " pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:16.617164 master-0 kubenswrapper[28758]: I0223 14:48:16.614795 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031f392b-0b1d-4f88-a32b-d015b41857f6-operator-scripts\") pod \"glance-b4e2-account-create-update-r6sdz\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") " pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:16.629862 master-0 kubenswrapper[28758]: I0223 14:48:16.629740 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wm9q\" (UniqueName: \"kubernetes.io/projected/031f392b-0b1d-4f88-a32b-d015b41857f6-kube-api-access-7wm9q\") pod 
\"glance-b4e2-account-create-update-r6sdz\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") " pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:16.637531 master-0 kubenswrapper[28758]: I0223 14:48:16.637385 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-m897g" podStartSLOduration=2.546375754 podStartE2EDuration="5.637358309s" podCreationTimestamp="2026-02-23 14:48:11 +0000 UTC" firstStartedPulling="2026-02-23 14:48:12.799356646 +0000 UTC m=+824.925672578" lastFinishedPulling="2026-02-23 14:48:15.890339211 +0000 UTC m=+828.016655133" observedRunningTime="2026-02-23 14:48:16.626794458 +0000 UTC m=+828.753110390" watchObservedRunningTime="2026-02-23 14:48:16.637358309 +0000 UTC m=+828.763674251" Feb 23 14:48:16.714224 master-0 kubenswrapper[28758]: I0223 14:48:16.712343 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sl8ng" Feb 23 14:48:16.803098 master-0 kubenswrapper[28758]: I0223 14:48:16.802233 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b4e2-account-create-update-r6sdz" Feb 23 14:48:17.038600 master-0 kubenswrapper[28758]: I0223 14:48:17.033815 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qwl8p"] Feb 23 14:48:17.038600 master-0 kubenswrapper[28758]: I0223 14:48:17.035128 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.066557 master-0 kubenswrapper[28758]: I0223 14:48:17.065510 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qwl8p"] Feb 23 14:48:17.143047 master-0 kubenswrapper[28758]: I0223 14:48:17.142962 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7wm5\" (UniqueName: \"kubernetes.io/projected/6cc23db2-205f-4165-8675-1fa946a30a84-kube-api-access-f7wm5\") pod \"keystone-db-create-qwl8p\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") " pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.144375 master-0 kubenswrapper[28758]: I0223 14:48:17.144356 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc23db2-205f-4165-8675-1fa946a30a84-operator-scripts\") pod \"keystone-db-create-qwl8p\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") " pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.158152 master-0 kubenswrapper[28758]: I0223 14:48:17.155702 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ec94-account-create-update-qwcst"] Feb 23 14:48:17.158152 master-0 kubenswrapper[28758]: I0223 14:48:17.157891 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.174132 master-0 kubenswrapper[28758]: I0223 14:48:17.174034 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 23 14:48:17.175618 master-0 kubenswrapper[28758]: I0223 14:48:17.175580 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ec94-account-create-update-qwcst"] Feb 23 14:48:17.247320 master-0 kubenswrapper[28758]: I0223 14:48:17.247241 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc23db2-205f-4165-8675-1fa946a30a84-operator-scripts\") pod \"keystone-db-create-qwl8p\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") " pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.247630 master-0 kubenswrapper[28758]: I0223 14:48:17.247355 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6qkd\" (UniqueName: \"kubernetes.io/projected/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-kube-api-access-k6qkd\") pod \"keystone-ec94-account-create-update-qwcst\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") " pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.247630 master-0 kubenswrapper[28758]: I0223 14:48:17.247382 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-operator-scripts\") pod \"keystone-ec94-account-create-update-qwcst\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") " pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.247630 master-0 kubenswrapper[28758]: I0223 14:48:17.247427 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7wm5\" (UniqueName: 
\"kubernetes.io/projected/6cc23db2-205f-4165-8675-1fa946a30a84-kube-api-access-f7wm5\") pod \"keystone-db-create-qwl8p\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") " pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.265990 master-0 kubenswrapper[28758]: I0223 14:48:17.265321 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc23db2-205f-4165-8675-1fa946a30a84-operator-scripts\") pod \"keystone-db-create-qwl8p\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") " pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.266799 master-0 kubenswrapper[28758]: I0223 14:48:17.266775 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-sl8ng"] Feb 23 14:48:17.270864 master-0 kubenswrapper[28758]: I0223 14:48:17.270841 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7wm5\" (UniqueName: \"kubernetes.io/projected/6cc23db2-205f-4165-8675-1fa946a30a84-kube-api-access-f7wm5\") pod \"keystone-db-create-qwl8p\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") " pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.275319 master-0 kubenswrapper[28758]: I0223 14:48:17.275296 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7xv7b"] Feb 23 14:48:17.276898 master-0 kubenswrapper[28758]: I0223 14:48:17.276878 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.314301 master-0 kubenswrapper[28758]: I0223 14:48:17.314044 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7xv7b"] Feb 23 14:48:17.349205 master-0 kubenswrapper[28758]: I0223 14:48:17.349138 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j5z5\" (UniqueName: \"kubernetes.io/projected/581dcbe4-b0bd-4099-8150-902c7a87c5d0-kube-api-access-9j5z5\") pod \"placement-db-create-7xv7b\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") " pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.349429 master-0 kubenswrapper[28758]: I0223 14:48:17.349261 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581dcbe4-b0bd-4099-8150-902c7a87c5d0-operator-scripts\") pod \"placement-db-create-7xv7b\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") " pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.349429 master-0 kubenswrapper[28758]: I0223 14:48:17.349315 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6qkd\" (UniqueName: \"kubernetes.io/projected/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-kube-api-access-k6qkd\") pod \"keystone-ec94-account-create-update-qwcst\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") " pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.349429 master-0 kubenswrapper[28758]: I0223 14:48:17.349334 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-operator-scripts\") pod \"keystone-ec94-account-create-update-qwcst\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") " pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.350401 master-0 
kubenswrapper[28758]: I0223 14:48:17.350348 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-operator-scripts\") pod \"keystone-ec94-account-create-update-qwcst\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") " pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.374679 master-0 kubenswrapper[28758]: I0223 14:48:17.374643 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6qkd\" (UniqueName: \"kubernetes.io/projected/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-kube-api-access-k6qkd\") pod \"keystone-ec94-account-create-update-qwcst\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") " pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.388121 master-0 kubenswrapper[28758]: I0223 14:48:17.388088 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8f8d-account-create-update-6sf9t"] Feb 23 14:48:17.389655 master-0 kubenswrapper[28758]: I0223 14:48:17.389637 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 14:48:17.392168 master-0 kubenswrapper[28758]: I0223 14:48:17.392152 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 23 14:48:17.410212 master-0 kubenswrapper[28758]: I0223 14:48:17.410085 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qwl8p" Feb 23 14:48:17.424167 master-0 kubenswrapper[28758]: I0223 14:48:17.424126 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8f8d-account-create-update-6sf9t"] Feb 23 14:48:17.451136 master-0 kubenswrapper[28758]: I0223 14:48:17.451081 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j5z5\" (UniqueName: \"kubernetes.io/projected/581dcbe4-b0bd-4099-8150-902c7a87c5d0-kube-api-access-9j5z5\") pod \"placement-db-create-7xv7b\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") " pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.451320 master-0 kubenswrapper[28758]: I0223 14:48:17.451164 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttz7s\" (UniqueName: \"kubernetes.io/projected/0f01ce05-6e10-4db8-b887-bd8caf29c98a-kube-api-access-ttz7s\") pod \"placement-8f8d-account-create-update-6sf9t\" (UID: \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") " pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 14:48:17.451320 master-0 kubenswrapper[28758]: I0223 14:48:17.451262 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581dcbe4-b0bd-4099-8150-902c7a87c5d0-operator-scripts\") pod \"placement-db-create-7xv7b\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") " pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.451410 master-0 kubenswrapper[28758]: I0223 14:48:17.451383 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f01ce05-6e10-4db8-b887-bd8caf29c98a-operator-scripts\") pod \"placement-8f8d-account-create-update-6sf9t\" (UID: \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") " pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 
14:48:17.452119 master-0 kubenswrapper[28758]: I0223 14:48:17.452101 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581dcbe4-b0bd-4099-8150-902c7a87c5d0-operator-scripts\") pod \"placement-db-create-7xv7b\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") " pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.474669 master-0 kubenswrapper[28758]: I0223 14:48:17.474602 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j5z5\" (UniqueName: \"kubernetes.io/projected/581dcbe4-b0bd-4099-8150-902c7a87c5d0-kube-api-access-9j5z5\") pod \"placement-db-create-7xv7b\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") " pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.481467 master-0 kubenswrapper[28758]: I0223 14:48:17.481113 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ec94-account-create-update-qwcst" Feb 23 14:48:17.499229 master-0 kubenswrapper[28758]: I0223 14:48:17.499152 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b4e2-account-create-update-r6sdz"] Feb 23 14:48:17.588355 master-0 kubenswrapper[28758]: I0223 14:48:17.588212 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f01ce05-6e10-4db8-b887-bd8caf29c98a-operator-scripts\") pod \"placement-8f8d-account-create-update-6sf9t\" (UID: \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") " pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 14:48:17.588597 master-0 kubenswrapper[28758]: I0223 14:48:17.588405 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttz7s\" (UniqueName: \"kubernetes.io/projected/0f01ce05-6e10-4db8-b887-bd8caf29c98a-kube-api-access-ttz7s\") pod \"placement-8f8d-account-create-update-6sf9t\" (UID: 
\"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") " pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 14:48:17.589315 master-0 kubenswrapper[28758]: I0223 14:48:17.589134 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f01ce05-6e10-4db8-b887-bd8caf29c98a-operator-scripts\") pod \"placement-8f8d-account-create-update-6sf9t\" (UID: \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") " pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 14:48:17.611384 master-0 kubenswrapper[28758]: I0223 14:48:17.611328 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttz7s\" (UniqueName: \"kubernetes.io/projected/0f01ce05-6e10-4db8-b887-bd8caf29c98a-kube-api-access-ttz7s\") pod \"placement-8f8d-account-create-update-6sf9t\" (UID: \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") " pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 14:48:17.614637 master-0 kubenswrapper[28758]: I0223 14:48:17.614596 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sl8ng" event={"ID":"8bb37fbb-098a-4f0c-894a-918e22d2c343","Type":"ContainerStarted","Data":"2c08d53d13d7f30165ac7aedc93d8e1c641d5d13356b68e04f7c1f2da4eabe08"} Feb 23 14:48:17.614712 master-0 kubenswrapper[28758]: I0223 14:48:17.614647 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sl8ng" event={"ID":"8bb37fbb-098a-4f0c-894a-918e22d2c343","Type":"ContainerStarted","Data":"df18571687a6d0e0eee987aadb1297a18201732746fa95629d097dac3eae5efe"} Feb 23 14:48:17.617423 master-0 kubenswrapper[28758]: I0223 14:48:17.617380 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b4e2-account-create-update-r6sdz" event={"ID":"031f392b-0b1d-4f88-a32b-d015b41857f6","Type":"ContainerStarted","Data":"933611e784c4b5b557db1eb09900963b493673723c313c5b224b7991e6d83b34"} Feb 23 14:48:17.634707 master-0 
kubenswrapper[28758]: I0223 14:48:17.633170 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-sl8ng" podStartSLOduration=1.633148007 podStartE2EDuration="1.633148007s" podCreationTimestamp="2026-02-23 14:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:17.63062698 +0000 UTC m=+829.756942922" watchObservedRunningTime="2026-02-23 14:48:17.633148007 +0000 UTC m=+829.759463939" Feb 23 14:48:17.671450 master-0 kubenswrapper[28758]: I0223 14:48:17.670603 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7xv7b" Feb 23 14:48:17.729684 master-0 kubenswrapper[28758]: I0223 14:48:17.729616 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8f8d-account-create-update-6sf9t" Feb 23 14:48:17.884679 master-0 kubenswrapper[28758]: I0223 14:48:17.882951 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qwl8p"] Feb 23 14:48:17.912812 master-0 kubenswrapper[28758]: W0223 14:48:17.912766 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cc23db2_205f_4165_8675_1fa946a30a84.slice/crio-ae980f16da40639bf110418a554fe68172257f8377b8f8dc6200e992cb5ca482 WatchSource:0}: Error finding container ae980f16da40639bf110418a554fe68172257f8377b8f8dc6200e992cb5ca482: Status 404 returned error can't find the container with id ae980f16da40639bf110418a554fe68172257f8377b8f8dc6200e992cb5ca482 Feb 23 14:48:18.050684 master-0 kubenswrapper[28758]: I0223 14:48:18.050610 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ec94-account-create-update-qwcst"] Feb 23 14:48:18.059348 master-0 kubenswrapper[28758]: W0223 14:48:18.059286 28758 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03be8bbd_b280_416a_9197_9dbdd3eaf4f7.slice/crio-66e3375142fab6ba72309f05e3d897016f0918e7221a9542eba72f42aed554c8 WatchSource:0}: Error finding container 66e3375142fab6ba72309f05e3d897016f0918e7221a9542eba72f42aed554c8: Status 404 returned error can't find the container with id 66e3375142fab6ba72309f05e3d897016f0918e7221a9542eba72f42aed554c8 Feb 23 14:48:18.214032 master-0 kubenswrapper[28758]: I0223 14:48:18.207644 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7xv7b"] Feb 23 14:48:18.327086 master-0 kubenswrapper[28758]: I0223 14:48:18.326187 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8f8d-account-create-update-6sf9t"] Feb 23 14:48:18.336089 master-0 kubenswrapper[28758]: W0223 14:48:18.336045 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f01ce05_6e10_4db8_b887_bd8caf29c98a.slice/crio-9bc11160c65ad8b6af8c3045562ee4d02e5f2baa56b298e0e2e2256fe02a2828 WatchSource:0}: Error finding container 9bc11160c65ad8b6af8c3045562ee4d02e5f2baa56b298e0e2e2256fe02a2828: Status 404 returned error can't find the container with id 9bc11160c65ad8b6af8c3045562ee4d02e5f2baa56b298e0e2e2256fe02a2828 Feb 23 14:48:18.641701 master-0 kubenswrapper[28758]: I0223 14:48:18.641531 28758 generic.go:334] "Generic (PLEG): container finished" podID="6cc23db2-205f-4165-8675-1fa946a30a84" containerID="145da82ebe449309f1e45329b59bacb66a1a189eca6a38a3f0bd22e77feea4ee" exitCode=0 Feb 23 14:48:18.641701 master-0 kubenswrapper[28758]: I0223 14:48:18.641612 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qwl8p" event={"ID":"6cc23db2-205f-4165-8675-1fa946a30a84","Type":"ContainerDied","Data":"145da82ebe449309f1e45329b59bacb66a1a189eca6a38a3f0bd22e77feea4ee"} Feb 23 14:48:18.641701 master-0 kubenswrapper[28758]: I0223 
14:48:18.641641 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qwl8p" event={"ID":"6cc23db2-205f-4165-8675-1fa946a30a84","Type":"ContainerStarted","Data":"ae980f16da40639bf110418a554fe68172257f8377b8f8dc6200e992cb5ca482"} Feb 23 14:48:18.643827 master-0 kubenswrapper[28758]: I0223 14:48:18.643352 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8f8d-account-create-update-6sf9t" event={"ID":"0f01ce05-6e10-4db8-b887-bd8caf29c98a","Type":"ContainerStarted","Data":"b93b3d403c2dfa5ad4f37ec243fb3e4f8ec183bddff0e7512111eadcc6035509"} Feb 23 14:48:18.643827 master-0 kubenswrapper[28758]: I0223 14:48:18.643384 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8f8d-account-create-update-6sf9t" event={"ID":"0f01ce05-6e10-4db8-b887-bd8caf29c98a","Type":"ContainerStarted","Data":"9bc11160c65ad8b6af8c3045562ee4d02e5f2baa56b298e0e2e2256fe02a2828"} Feb 23 14:48:18.647816 master-0 kubenswrapper[28758]: I0223 14:48:18.646538 28758 generic.go:334] "Generic (PLEG): container finished" podID="581dcbe4-b0bd-4099-8150-902c7a87c5d0" containerID="89fff8e9c27bd5cbbd4db4f47e44b76657cb24111b81780c074ebabbc09c8ac4" exitCode=0 Feb 23 14:48:18.647816 master-0 kubenswrapper[28758]: I0223 14:48:18.646580 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xv7b" event={"ID":"581dcbe4-b0bd-4099-8150-902c7a87c5d0","Type":"ContainerDied","Data":"89fff8e9c27bd5cbbd4db4f47e44b76657cb24111b81780c074ebabbc09c8ac4"} Feb 23 14:48:18.647816 master-0 kubenswrapper[28758]: I0223 14:48:18.646596 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xv7b" event={"ID":"581dcbe4-b0bd-4099-8150-902c7a87c5d0","Type":"ContainerStarted","Data":"f64a0df9a1304643426cfb0d931a55b804d3922d7a9952cc19e627121b43bad1"} Feb 23 14:48:18.652915 master-0 kubenswrapper[28758]: I0223 14:48:18.652864 28758 generic.go:334] "Generic (PLEG): container 
finished" podID="031f392b-0b1d-4f88-a32b-d015b41857f6" containerID="2472c8e571517e2668f4a6d92d50d33d4b5bfec5319bf6de6c982ddbe222435e" exitCode=0 Feb 23 14:48:18.653170 master-0 kubenswrapper[28758]: I0223 14:48:18.653051 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b4e2-account-create-update-r6sdz" event={"ID":"031f392b-0b1d-4f88-a32b-d015b41857f6","Type":"ContainerDied","Data":"2472c8e571517e2668f4a6d92d50d33d4b5bfec5319bf6de6c982ddbe222435e"} Feb 23 14:48:18.658581 master-0 kubenswrapper[28758]: I0223 14:48:18.658421 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec94-account-create-update-qwcst" event={"ID":"03be8bbd-b280-416a-9197-9dbdd3eaf4f7","Type":"ContainerStarted","Data":"69379567b30488cb54059fec6b95c6d1b4534033cd6568962e1e943d7c2d18b4"} Feb 23 14:48:18.658581 master-0 kubenswrapper[28758]: I0223 14:48:18.658497 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec94-account-create-update-qwcst" event={"ID":"03be8bbd-b280-416a-9197-9dbdd3eaf4f7","Type":"ContainerStarted","Data":"66e3375142fab6ba72309f05e3d897016f0918e7221a9542eba72f42aed554c8"} Feb 23 14:48:18.666054 master-0 kubenswrapper[28758]: I0223 14:48:18.663807 28758 generic.go:334] "Generic (PLEG): container finished" podID="8bb37fbb-098a-4f0c-894a-918e22d2c343" containerID="2c08d53d13d7f30165ac7aedc93d8e1c641d5d13356b68e04f7c1f2da4eabe08" exitCode=0 Feb 23 14:48:18.666054 master-0 kubenswrapper[28758]: I0223 14:48:18.663868 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sl8ng" event={"ID":"8bb37fbb-098a-4f0c-894a-918e22d2c343","Type":"ContainerDied","Data":"2c08d53d13d7f30165ac7aedc93d8e1c641d5d13356b68e04f7c1f2da4eabe08"} Feb 23 14:48:18.684704 master-0 kubenswrapper[28758]: I0223 14:48:18.684627 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:48:18.708250 master-0 
kubenswrapper[28758]: I0223 14:48:18.708168 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8f8d-account-create-update-6sf9t" podStartSLOduration=1.708148058 podStartE2EDuration="1.708148058s" podCreationTimestamp="2026-02-23 14:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:18.698897272 +0000 UTC m=+830.825213204" watchObservedRunningTime="2026-02-23 14:48:18.708148058 +0000 UTC m=+830.834464010" Feb 23 14:48:18.936008 master-0 kubenswrapper[28758]: I0223 14:48:18.935941 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c45d57b9c-qrtjc"] Feb 23 14:48:18.936288 master-0 kubenswrapper[28758]: I0223 14:48:18.936221 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" podUID="f23ec168-6731-4fdf-8b85-837218a794f6" containerName="dnsmasq-dns" containerID="cri-o://c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef" gracePeriod=10 Feb 23 14:48:19.053577 master-0 kubenswrapper[28758]: I0223 14:48:19.051546 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0" Feb 23 14:48:19.053577 master-0 kubenswrapper[28758]: E0223 14:48:19.051785 28758 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 23 14:48:19.053577 master-0 kubenswrapper[28758]: E0223 14:48:19.051799 28758 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 23 14:48:19.053577 master-0 kubenswrapper[28758]: E0223 14:48:19.051852 28758 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift podName:3b978484-31f3-46af-aedd-e96a997da517 nodeName:}" failed. No retries permitted until 2026-02-23 14:48:27.051837549 +0000 UTC m=+839.178153471 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift") pod "swift-storage-0" (UID: "3b978484-31f3-46af-aedd-e96a997da517") : configmap "swift-ring-files" not found
Feb 23 14:48:19.419783 master-0 kubenswrapper[28758]: I0223 14:48:19.419191 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-x2jgp"]
Feb 23 14:48:19.457578 master-0 kubenswrapper[28758]: I0223 14:48:19.451707 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-x2jgp"]
Feb 23 14:48:19.528140 master-0 kubenswrapper[28758]: I0223 14:48:19.527944 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:48:19.569637 master-0 kubenswrapper[28758]: I0223 14:48:19.569576 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6shm8\" (UniqueName: \"kubernetes.io/projected/f23ec168-6731-4fdf-8b85-837218a794f6-kube-api-access-6shm8\") pod \"f23ec168-6731-4fdf-8b85-837218a794f6\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") "
Feb 23 14:48:19.569907 master-0 kubenswrapper[28758]: I0223 14:48:19.569852 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-config\") pod \"f23ec168-6731-4fdf-8b85-837218a794f6\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") "
Feb 23 14:48:19.569970 master-0 kubenswrapper[28758]: I0223 14:48:19.569914 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-dns-svc\") pod \"f23ec168-6731-4fdf-8b85-837218a794f6\" (UID: \"f23ec168-6731-4fdf-8b85-837218a794f6\") "
Feb 23 14:48:19.589873 master-0 kubenswrapper[28758]: I0223 14:48:19.589809 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23ec168-6731-4fdf-8b85-837218a794f6-kube-api-access-6shm8" (OuterVolumeSpecName: "kube-api-access-6shm8") pod "f23ec168-6731-4fdf-8b85-837218a794f6" (UID: "f23ec168-6731-4fdf-8b85-837218a794f6"). InnerVolumeSpecName "kube-api-access-6shm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:19.629128 master-0 kubenswrapper[28758]: I0223 14:48:19.629070 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-config" (OuterVolumeSpecName: "config") pod "f23ec168-6731-4fdf-8b85-837218a794f6" (UID: "f23ec168-6731-4fdf-8b85-837218a794f6").
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:19.633956 master-0 kubenswrapper[28758]: I0223 14:48:19.633113 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f23ec168-6731-4fdf-8b85-837218a794f6" (UID: "f23ec168-6731-4fdf-8b85-837218a794f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:19.672699 master-0 kubenswrapper[28758]: I0223 14:48:19.672568 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:19.672699 master-0 kubenswrapper[28758]: I0223 14:48:19.672617 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f23ec168-6731-4fdf-8b85-837218a794f6-dns-svc\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:19.672699 master-0 kubenswrapper[28758]: I0223 14:48:19.672630 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6shm8\" (UniqueName: \"kubernetes.io/projected/f23ec168-6731-4fdf-8b85-837218a794f6-kube-api-access-6shm8\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:19.682925 master-0 kubenswrapper[28758]: I0223 14:48:19.682860 28758 generic.go:334] "Generic (PLEG): container finished" podID="0f01ce05-6e10-4db8-b887-bd8caf29c98a" containerID="b93b3d403c2dfa5ad4f37ec243fb3e4f8ec183bddff0e7512111eadcc6035509" exitCode=0
Feb 23 14:48:19.683143 master-0 kubenswrapper[28758]: I0223 14:48:19.682937 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8f8d-account-create-update-6sf9t" event={"ID":"0f01ce05-6e10-4db8-b887-bd8caf29c98a","Type":"ContainerDied","Data":"b93b3d403c2dfa5ad4f37ec243fb3e4f8ec183bddff0e7512111eadcc6035509"}
Feb 23 14:48:19.691143
master-0 kubenswrapper[28758]: I0223 14:48:19.691090 28758 generic.go:334] "Generic (PLEG): container finished" podID="03be8bbd-b280-416a-9197-9dbdd3eaf4f7" containerID="69379567b30488cb54059fec6b95c6d1b4534033cd6568962e1e943d7c2d18b4" exitCode=0
Feb 23 14:48:19.691245 master-0 kubenswrapper[28758]: I0223 14:48:19.691163 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec94-account-create-update-qwcst" event={"ID":"03be8bbd-b280-416a-9197-9dbdd3eaf4f7","Type":"ContainerDied","Data":"69379567b30488cb54059fec6b95c6d1b4534033cd6568962e1e943d7c2d18b4"}
Feb 23 14:48:19.693873 master-0 kubenswrapper[28758]: I0223 14:48:19.693825 28758 generic.go:334] "Generic (PLEG): container finished" podID="f23ec168-6731-4fdf-8b85-837218a794f6" containerID="c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef" exitCode=0
Feb 23 14:48:19.694084 master-0 kubenswrapper[28758]: I0223 14:48:19.694056 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc"
Feb 23 14:48:19.698668 master-0 kubenswrapper[28758]: I0223 14:48:19.697048 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" event={"ID":"f23ec168-6731-4fdf-8b85-837218a794f6","Type":"ContainerDied","Data":"c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef"}
Feb 23 14:48:19.698758 master-0 kubenswrapper[28758]: I0223 14:48:19.698693 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c45d57b9c-qrtjc" event={"ID":"f23ec168-6731-4fdf-8b85-837218a794f6","Type":"ContainerDied","Data":"51e34f0a2697d5173501f9e1efcdff962de18b3fc33ea0f3dd16904d58721fe9"}
Feb 23 14:48:19.698758 master-0 kubenswrapper[28758]: I0223 14:48:19.698734 28758 scope.go:117] "RemoveContainer" containerID="c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef"
Feb 23 14:48:19.750261 master-0 kubenswrapper[28758]: I0223 14:48:19.750215 28758
scope.go:117] "RemoveContainer" containerID="a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd"
Feb 23 14:48:19.755608 master-0 kubenswrapper[28758]: I0223 14:48:19.755550 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c45d57b9c-qrtjc"]
Feb 23 14:48:19.771757 master-0 kubenswrapper[28758]: I0223 14:48:19.771676 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c45d57b9c-qrtjc"]
Feb 23 14:48:19.790521 master-0 kubenswrapper[28758]: I0223 14:48:19.790151 28758 scope.go:117] "RemoveContainer" containerID="c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef"
Feb 23 14:48:19.792174 master-0 kubenswrapper[28758]: E0223 14:48:19.790755 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef\": container with ID starting with c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef not found: ID does not exist" containerID="c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef"
Feb 23 14:48:19.792174 master-0 kubenswrapper[28758]: I0223 14:48:19.790812 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef"} err="failed to get container status \"c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef\": rpc error: code = NotFound desc = could not find container \"c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef\": container with ID starting with c83e65d806e36841bc33b02f0bc6e4d81482e47102ef04f289cbbc1b90c287ef not found: ID does not exist"
Feb 23 14:48:19.792174 master-0 kubenswrapper[28758]: I0223 14:48:19.790841 28758 scope.go:117] "RemoveContainer" containerID="a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd"
Feb 23 14:48:19.792174 master-0 kubenswrapper[28758]: E0223
14:48:19.791170 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd\": container with ID starting with a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd not found: ID does not exist" containerID="a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd"
Feb 23 14:48:19.792174 master-0 kubenswrapper[28758]: I0223 14:48:19.791229 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd"} err="failed to get container status \"a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd\": rpc error: code = NotFound desc = could not find container \"a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd\": container with ID starting with a56436abbd1ed7672c67ab216c64edf9c3002ae2f2d2c4944bd7ead7d7b255fd not found: ID does not exist"
Feb 23 14:48:20.106255 master-0 kubenswrapper[28758]: I0223 14:48:20.106152 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="614e1111-ec92-499f-8dff-a31274ee9922" path="/var/lib/kubelet/pods/614e1111-ec92-499f-8dff-a31274ee9922/volumes"
Feb 23 14:48:20.107902 master-0 kubenswrapper[28758]: I0223 14:48:20.107827 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f23ec168-6731-4fdf-8b85-837218a794f6" path="/var/lib/kubelet/pods/f23ec168-6731-4fdf-8b85-837218a794f6/volumes"
Feb 23 14:48:20.321755 master-0 kubenswrapper[28758]: I0223 14:48:20.321709 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-ec94-account-create-update-qwcst"
Feb 23 14:48:20.391034 master-0 kubenswrapper[28758]: I0223 14:48:20.390951 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-operator-scripts\") pod \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") "
Feb 23 14:48:20.391270 master-0 kubenswrapper[28758]: I0223 14:48:20.391043 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6qkd\" (UniqueName: \"kubernetes.io/projected/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-kube-api-access-k6qkd\") pod \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\" (UID: \"03be8bbd-b280-416a-9197-9dbdd3eaf4f7\") "
Feb 23 14:48:20.393001 master-0 kubenswrapper[28758]: I0223 14:48:20.392961 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03be8bbd-b280-416a-9197-9dbdd3eaf4f7" (UID: "03be8bbd-b280-416a-9197-9dbdd3eaf4f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:20.395531 master-0 kubenswrapper[28758]: I0223 14:48:20.395464 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-kube-api-access-k6qkd" (OuterVolumeSpecName: "kube-api-access-k6qkd") pod "03be8bbd-b280-416a-9197-9dbdd3eaf4f7" (UID: "03be8bbd-b280-416a-9197-9dbdd3eaf4f7"). InnerVolumeSpecName "kube-api-access-k6qkd".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:20.494567 master-0 kubenswrapper[28758]: I0223 14:48:20.493609 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.494567 master-0 kubenswrapper[28758]: I0223 14:48:20.493654 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6qkd\" (UniqueName: \"kubernetes.io/projected/03be8bbd-b280-416a-9197-9dbdd3eaf4f7-kube-api-access-k6qkd\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.585369 master-0 kubenswrapper[28758]: I0223 14:48:20.584928 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-sl8ng"
Feb 23 14:48:20.590998 master-0 kubenswrapper[28758]: I0223 14:48:20.590955 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qwl8p"
Feb 23 14:48:20.595672 master-0 kubenswrapper[28758]: I0223 14:48:20.595628 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7xv7b"
Feb 23 14:48:20.609017 master-0 kubenswrapper[28758]: I0223 14:48:20.608945 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-b4e2-account-create-update-r6sdz"
Feb 23 14:48:20.698178 master-0 kubenswrapper[28758]: I0223 14:48:20.698115 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7wm5\" (UniqueName: \"kubernetes.io/projected/6cc23db2-205f-4165-8675-1fa946a30a84-kube-api-access-f7wm5\") pod \"6cc23db2-205f-4165-8675-1fa946a30a84\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") "
Feb 23 14:48:20.698364 master-0 kubenswrapper[28758]: I0223 14:48:20.698227 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j5z5\" (UniqueName: \"kubernetes.io/projected/581dcbe4-b0bd-4099-8150-902c7a87c5d0-kube-api-access-9j5z5\") pod \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") "
Feb 23 14:48:20.698364 master-0 kubenswrapper[28758]: I0223 14:48:20.698329 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031f392b-0b1d-4f88-a32b-d015b41857f6-operator-scripts\") pod \"031f392b-0b1d-4f88-a32b-d015b41857f6\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") "
Feb 23 14:48:20.698455 master-0 kubenswrapper[28758]: I0223 14:48:20.698362 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581dcbe4-b0bd-4099-8150-902c7a87c5d0-operator-scripts\") pod \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\" (UID: \"581dcbe4-b0bd-4099-8150-902c7a87c5d0\") "
Feb 23 14:48:20.698455 master-0 kubenswrapper[28758]: I0223 14:48:20.698424 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bb37fbb-098a-4f0c-894a-918e22d2c343-operator-scripts\") pod \"8bb37fbb-098a-4f0c-894a-918e22d2c343\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") "
Feb 23 14:48:20.698577 master-0
kubenswrapper[28758]: I0223 14:48:20.698454 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clvqf\" (UniqueName: \"kubernetes.io/projected/8bb37fbb-098a-4f0c-894a-918e22d2c343-kube-api-access-clvqf\") pod \"8bb37fbb-098a-4f0c-894a-918e22d2c343\" (UID: \"8bb37fbb-098a-4f0c-894a-918e22d2c343\") "
Feb 23 14:48:20.699148 master-0 kubenswrapper[28758]: I0223 14:48:20.699066 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/031f392b-0b1d-4f88-a32b-d015b41857f6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "031f392b-0b1d-4f88-a32b-d015b41857f6" (UID: "031f392b-0b1d-4f88-a32b-d015b41857f6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:20.699909 master-0 kubenswrapper[28758]: I0223 14:48:20.699867 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581dcbe4-b0bd-4099-8150-902c7a87c5d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "581dcbe4-b0bd-4099-8150-902c7a87c5d0" (UID: "581dcbe4-b0bd-4099-8150-902c7a87c5d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:20.699993 master-0 kubenswrapper[28758]: I0223 14:48:20.699940 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bb37fbb-098a-4f0c-894a-918e22d2c343-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bb37fbb-098a-4f0c-894a-918e22d2c343" (UID: "8bb37fbb-098a-4f0c-894a-918e22d2c343"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:20.701270 master-0 kubenswrapper[28758]: I0223 14:48:20.701233 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc23db2-205f-4165-8675-1fa946a30a84-operator-scripts\") pod \"6cc23db2-205f-4165-8675-1fa946a30a84\" (UID: \"6cc23db2-205f-4165-8675-1fa946a30a84\") "
Feb 23 14:48:20.701334 master-0 kubenswrapper[28758]: I0223 14:48:20.701299 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wm9q\" (UniqueName: \"kubernetes.io/projected/031f392b-0b1d-4f88-a32b-d015b41857f6-kube-api-access-7wm9q\") pod \"031f392b-0b1d-4f88-a32b-d015b41857f6\" (UID: \"031f392b-0b1d-4f88-a32b-d015b41857f6\") "
Feb 23 14:48:20.701881 master-0 kubenswrapper[28758]: I0223 14:48:20.701819 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cc23db2-205f-4165-8675-1fa946a30a84-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6cc23db2-205f-4165-8675-1fa946a30a84" (UID: "6cc23db2-205f-4165-8675-1fa946a30a84"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:20.702442 master-0 kubenswrapper[28758]: I0223 14:48:20.702408 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/031f392b-0b1d-4f88-a32b-d015b41857f6-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.702520 master-0 kubenswrapper[28758]: I0223 14:48:20.702444 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/581dcbe4-b0bd-4099-8150-902c7a87c5d0-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.702520 master-0 kubenswrapper[28758]: I0223 14:48:20.702458 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bb37fbb-098a-4f0c-894a-918e22d2c343-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.702520 master-0 kubenswrapper[28758]: I0223 14:48:20.702470 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cc23db2-205f-4165-8675-1fa946a30a84-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.702520 master-0 kubenswrapper[28758]: I0223 14:48:20.702501 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc23db2-205f-4165-8675-1fa946a30a84-kube-api-access-f7wm5" (OuterVolumeSpecName: "kube-api-access-f7wm5") pod "6cc23db2-205f-4165-8675-1fa946a30a84" (UID: "6cc23db2-205f-4165-8675-1fa946a30a84"). InnerVolumeSpecName "kube-api-access-f7wm5".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:20.703028 master-0 kubenswrapper[28758]: I0223 14:48:20.702997 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581dcbe4-b0bd-4099-8150-902c7a87c5d0-kube-api-access-9j5z5" (OuterVolumeSpecName: "kube-api-access-9j5z5") pod "581dcbe4-b0bd-4099-8150-902c7a87c5d0" (UID: "581dcbe4-b0bd-4099-8150-902c7a87c5d0"). InnerVolumeSpecName "kube-api-access-9j5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:20.703142 master-0 kubenswrapper[28758]: I0223 14:48:20.703112 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb37fbb-098a-4f0c-894a-918e22d2c343-kube-api-access-clvqf" (OuterVolumeSpecName: "kube-api-access-clvqf") pod "8bb37fbb-098a-4f0c-894a-918e22d2c343" (UID: "8bb37fbb-098a-4f0c-894a-918e22d2c343"). InnerVolumeSpecName "kube-api-access-clvqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:20.705562 master-0 kubenswrapper[28758]: I0223 14:48:20.705511 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031f392b-0b1d-4f88-a32b-d015b41857f6-kube-api-access-7wm9q" (OuterVolumeSpecName: "kube-api-access-7wm9q") pod "031f392b-0b1d-4f88-a32b-d015b41857f6" (UID: "031f392b-0b1d-4f88-a32b-d015b41857f6"). InnerVolumeSpecName "kube-api-access-7wm9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:20.706129 master-0 kubenswrapper[28758]: I0223 14:48:20.705751 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-create-7xv7b"
Feb 23 14:48:20.708586 master-0 kubenswrapper[28758]: I0223 14:48:20.708528 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7xv7b" event={"ID":"581dcbe4-b0bd-4099-8150-902c7a87c5d0","Type":"ContainerDied","Data":"f64a0df9a1304643426cfb0d931a55b804d3922d7a9952cc19e627121b43bad1"}
Feb 23 14:48:20.708704 master-0 kubenswrapper[28758]: I0223 14:48:20.708589 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f64a0df9a1304643426cfb0d931a55b804d3922d7a9952cc19e627121b43bad1"
Feb 23 14:48:20.719600 master-0 kubenswrapper[28758]: I0223 14:48:20.719014 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b4e2-account-create-update-r6sdz"
Feb 23 14:48:20.719600 master-0 kubenswrapper[28758]: I0223 14:48:20.719197 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b4e2-account-create-update-r6sdz" event={"ID":"031f392b-0b1d-4f88-a32b-d015b41857f6","Type":"ContainerDied","Data":"933611e784c4b5b557db1eb09900963b493673723c313c5b224b7991e6d83b34"}
Feb 23 14:48:20.719600 master-0 kubenswrapper[28758]: I0223 14:48:20.719259 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="933611e784c4b5b557db1eb09900963b493673723c313c5b224b7991e6d83b34"
Feb 23 14:48:20.725702 master-0 kubenswrapper[28758]: I0223 14:48:20.725678 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-ec94-account-create-update-qwcst"
Feb 23 14:48:20.725802 master-0 kubenswrapper[28758]: I0223 14:48:20.725705 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec94-account-create-update-qwcst" event={"ID":"03be8bbd-b280-416a-9197-9dbdd3eaf4f7","Type":"ContainerDied","Data":"66e3375142fab6ba72309f05e3d897016f0918e7221a9542eba72f42aed554c8"}
Feb 23 14:48:20.725905 master-0 kubenswrapper[28758]: I0223 14:48:20.725885 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e3375142fab6ba72309f05e3d897016f0918e7221a9542eba72f42aed554c8"
Feb 23 14:48:20.727899 master-0 kubenswrapper[28758]: I0223 14:48:20.727867 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-sl8ng" event={"ID":"8bb37fbb-098a-4f0c-894a-918e22d2c343","Type":"ContainerDied","Data":"df18571687a6d0e0eee987aadb1297a18201732746fa95629d097dac3eae5efe"}
Feb 23 14:48:20.727983 master-0 kubenswrapper[28758]: I0223 14:48:20.727936 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df18571687a6d0e0eee987aadb1297a18201732746fa95629d097dac3eae5efe"
Feb 23 14:48:20.728024 master-0 kubenswrapper[28758]: I0223 14:48:20.727984 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-db-create-sl8ng"
Feb 23 14:48:20.732469 master-0 kubenswrapper[28758]: I0223 14:48:20.732344 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qwl8p" event={"ID":"6cc23db2-205f-4165-8675-1fa946a30a84","Type":"ContainerDied","Data":"ae980f16da40639bf110418a554fe68172257f8377b8f8dc6200e992cb5ca482"}
Feb 23 14:48:20.732469 master-0 kubenswrapper[28758]: I0223 14:48:20.732398 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae980f16da40639bf110418a554fe68172257f8377b8f8dc6200e992cb5ca482"
Feb 23 14:48:20.732469 master-0 kubenswrapper[28758]: I0223 14:48:20.732400 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qwl8p"
Feb 23 14:48:20.804599 master-0 kubenswrapper[28758]: I0223 14:48:20.804526 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clvqf\" (UniqueName: \"kubernetes.io/projected/8bb37fbb-098a-4f0c-894a-918e22d2c343-kube-api-access-clvqf\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.804599 master-0 kubenswrapper[28758]: I0223 14:48:20.804572 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wm9q\" (UniqueName: \"kubernetes.io/projected/031f392b-0b1d-4f88-a32b-d015b41857f6-kube-api-access-7wm9q\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.804599 master-0 kubenswrapper[28758]: I0223 14:48:20.804584 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7wm5\" (UniqueName: \"kubernetes.io/projected/6cc23db2-205f-4165-8675-1fa946a30a84-kube-api-access-f7wm5\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:20.804599 master-0 kubenswrapper[28758]: I0223 14:48:20.804595 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j5z5\" (UniqueName: \"kubernetes.io/projected/581dcbe4-b0bd-4099-8150-902c7a87c5d0-kube-api-access-9j5z5\") on node
\"master-0\" DevicePath \"\""
Feb 23 14:48:21.281738 master-0 kubenswrapper[28758]: I0223 14:48:21.281691 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8f8d-account-create-update-6sf9t"
Feb 23 14:48:21.315417 master-0 kubenswrapper[28758]: I0223 14:48:21.315328 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttz7s\" (UniqueName: \"kubernetes.io/projected/0f01ce05-6e10-4db8-b887-bd8caf29c98a-kube-api-access-ttz7s\") pod \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\" (UID: \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") "
Feb 23 14:48:21.315417 master-0 kubenswrapper[28758]: I0223 14:48:21.315426 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f01ce05-6e10-4db8-b887-bd8caf29c98a-operator-scripts\") pod \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\" (UID: \"0f01ce05-6e10-4db8-b887-bd8caf29c98a\") "
Feb 23 14:48:21.316563 master-0 kubenswrapper[28758]: I0223 14:48:21.316513 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f01ce05-6e10-4db8-b887-bd8caf29c98a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f01ce05-6e10-4db8-b887-bd8caf29c98a" (UID: "0f01ce05-6e10-4db8-b887-bd8caf29c98a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:21.352242 master-0 kubenswrapper[28758]: I0223 14:48:21.352164 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f01ce05-6e10-4db8-b887-bd8caf29c98a-kube-api-access-ttz7s" (OuterVolumeSpecName: "kube-api-access-ttz7s") pod "0f01ce05-6e10-4db8-b887-bd8caf29c98a" (UID: "0f01ce05-6e10-4db8-b887-bd8caf29c98a"). InnerVolumeSpecName "kube-api-access-ttz7s".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:21.417292 master-0 kubenswrapper[28758]: I0223 14:48:21.417244 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttz7s\" (UniqueName: \"kubernetes.io/projected/0f01ce05-6e10-4db8-b887-bd8caf29c98a-kube-api-access-ttz7s\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:21.417292 master-0 kubenswrapper[28758]: I0223 14:48:21.417285 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f01ce05-6e10-4db8-b887-bd8caf29c98a-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:21.758940 master-0 kubenswrapper[28758]: I0223 14:48:21.758885 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8f8d-account-create-update-6sf9t" event={"ID":"0f01ce05-6e10-4db8-b887-bd8caf29c98a","Type":"ContainerDied","Data":"9bc11160c65ad8b6af8c3045562ee4d02e5f2baa56b298e0e2e2256fe02a2828"}
Feb 23 14:48:21.758940 master-0 kubenswrapper[28758]: I0223 14:48:21.758934 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc11160c65ad8b6af8c3045562ee4d02e5f2baa56b298e0e2e2256fe02a2828"
Feb 23 14:48:21.759542 master-0 kubenswrapper[28758]: I0223 14:48:21.758985 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-8f8d-account-create-update-6sf9t"
Feb 23 14:48:23.778955 master-0 kubenswrapper[28758]: I0223 14:48:23.778879 28758 generic.go:334] "Generic (PLEG): container finished" podID="518b33ee-87c2-4090-a971-735d98c01d7f" containerID="8ec02d365dd2b0d29e1bc74ea3698781d4530e9c0e2f5e7d1e4f99d28c6c8f1f" exitCode=0
Feb 23 14:48:23.778955 master-0 kubenswrapper[28758]: I0223 14:48:23.778943 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m897g" event={"ID":"518b33ee-87c2-4090-a971-735d98c01d7f","Type":"ContainerDied","Data":"8ec02d365dd2b0d29e1bc74ea3698781d4530e9c0e2f5e7d1e4f99d28c6c8f1f"}
Feb 23 14:48:24.442015 master-0 kubenswrapper[28758]: I0223 14:48:24.441941 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jwf8n"]
Feb 23 14:48:24.442608 master-0 kubenswrapper[28758]: E0223 14:48:24.442582 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f01ce05-6e10-4db8-b887-bd8caf29c98a" containerName="mariadb-account-create-update"
Feb 23 14:48:24.442608 master-0 kubenswrapper[28758]: I0223 14:48:24.442604 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f01ce05-6e10-4db8-b887-bd8caf29c98a" containerName="mariadb-account-create-update"
Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: E0223 14:48:24.442625 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03be8bbd-b280-416a-9197-9dbdd3eaf4f7" containerName="mariadb-account-create-update"
Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: I0223 14:48:24.442632 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="03be8bbd-b280-416a-9197-9dbdd3eaf4f7" containerName="mariadb-account-create-update"
Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: E0223 14:48:24.442642 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23ec168-6731-4fdf-8b85-837218a794f6" containerName="dnsmasq-dns"
Feb 23
14:48:24.442733 master-0 kubenswrapper[28758]: I0223 14:48:24.442648 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23ec168-6731-4fdf-8b85-837218a794f6" containerName="dnsmasq-dns" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: E0223 14:48:24.442663 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031f392b-0b1d-4f88-a32b-d015b41857f6" containerName="mariadb-account-create-update" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: I0223 14:48:24.442671 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="031f392b-0b1d-4f88-a32b-d015b41857f6" containerName="mariadb-account-create-update" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: E0223 14:48:24.442692 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581dcbe4-b0bd-4099-8150-902c7a87c5d0" containerName="mariadb-database-create" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: I0223 14:48:24.442699 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="581dcbe4-b0bd-4099-8150-902c7a87c5d0" containerName="mariadb-database-create" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: E0223 14:48:24.442713 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cc23db2-205f-4165-8675-1fa946a30a84" containerName="mariadb-database-create" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: I0223 14:48:24.442719 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc23db2-205f-4165-8675-1fa946a30a84" containerName="mariadb-database-create" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: E0223 14:48:24.442730 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23ec168-6731-4fdf-8b85-837218a794f6" containerName="init" Feb 23 14:48:24.442733 master-0 kubenswrapper[28758]: I0223 14:48:24.442745 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23ec168-6731-4fdf-8b85-837218a794f6" containerName="init" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: E0223 
14:48:24.442755 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb37fbb-098a-4f0c-894a-918e22d2c343" containerName="mariadb-database-create" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.442762 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb37fbb-098a-4f0c-894a-918e22d2c343" containerName="mariadb-database-create" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.443025 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23ec168-6731-4fdf-8b85-837218a794f6" containerName="dnsmasq-dns" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.443039 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f01ce05-6e10-4db8-b887-bd8caf29c98a" containerName="mariadb-account-create-update" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.443071 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="581dcbe4-b0bd-4099-8150-902c7a87c5d0" containerName="mariadb-database-create" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.443091 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="03be8bbd-b280-416a-9197-9dbdd3eaf4f7" containerName="mariadb-account-create-update" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.443105 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="031f392b-0b1d-4f88-a32b-d015b41857f6" containerName="mariadb-account-create-update" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.443132 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cc23db2-205f-4165-8675-1fa946a30a84" containerName="mariadb-database-create" Feb 23 14:48:24.443239 master-0 kubenswrapper[28758]: I0223 14:48:24.443152 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb37fbb-098a-4f0c-894a-918e22d2c343" containerName="mariadb-database-create" Feb 23 14:48:24.444027 master-0 
kubenswrapper[28758]: I0223 14:48:24.443997 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:24.447706 master-0 kubenswrapper[28758]: I0223 14:48:24.447659 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 23 14:48:24.468772 master-0 kubenswrapper[28758]: I0223 14:48:24.468663 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jwf8n"] Feb 23 14:48:24.480357 master-0 kubenswrapper[28758]: I0223 14:48:24.480300 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808186bd-8972-4a9d-9db7-7c3456319733-operator-scripts\") pod \"root-account-create-update-jwf8n\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:24.480740 master-0 kubenswrapper[28758]: I0223 14:48:24.480725 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfz6j\" (UniqueName: \"kubernetes.io/projected/808186bd-8972-4a9d-9db7-7c3456319733-kube-api-access-rfz6j\") pod \"root-account-create-update-jwf8n\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:24.582080 master-0 kubenswrapper[28758]: I0223 14:48:24.582020 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfz6j\" (UniqueName: \"kubernetes.io/projected/808186bd-8972-4a9d-9db7-7c3456319733-kube-api-access-rfz6j\") pod \"root-account-create-update-jwf8n\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:24.582296 master-0 kubenswrapper[28758]: I0223 14:48:24.582165 28758 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808186bd-8972-4a9d-9db7-7c3456319733-operator-scripts\") pod \"root-account-create-update-jwf8n\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:24.583141 master-0 kubenswrapper[28758]: I0223 14:48:24.583092 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808186bd-8972-4a9d-9db7-7c3456319733-operator-scripts\") pod \"root-account-create-update-jwf8n\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:24.599773 master-0 kubenswrapper[28758]: I0223 14:48:24.599707 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfz6j\" (UniqueName: \"kubernetes.io/projected/808186bd-8972-4a9d-9db7-7c3456319733-kube-api-access-rfz6j\") pod \"root-account-create-update-jwf8n\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:24.772018 master-0 kubenswrapper[28758]: I0223 14:48:24.771823 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:25.300302 master-0 kubenswrapper[28758]: I0223 14:48:25.298994 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:25.358000 master-0 kubenswrapper[28758]: I0223 14:48:25.357879 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jwf8n"] Feb 23 14:48:25.401507 master-0 kubenswrapper[28758]: I0223 14:48:25.401417 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-ring-data-devices\") pod \"518b33ee-87c2-4090-a971-735d98c01d7f\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " Feb 23 14:48:25.401833 master-0 kubenswrapper[28758]: I0223 14:48:25.401794 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-scripts\") pod \"518b33ee-87c2-4090-a971-735d98c01d7f\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " Feb 23 14:48:25.401902 master-0 kubenswrapper[28758]: I0223 14:48:25.401855 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-dispersionconf\") pod \"518b33ee-87c2-4090-a971-735d98c01d7f\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " Feb 23 14:48:25.401957 master-0 kubenswrapper[28758]: I0223 14:48:25.401918 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-combined-ca-bundle\") pod \"518b33ee-87c2-4090-a971-735d98c01d7f\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " Feb 23 14:48:25.401957 master-0 kubenswrapper[28758]: I0223 14:48:25.401942 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwd8n\" (UniqueName: 
\"kubernetes.io/projected/518b33ee-87c2-4090-a971-735d98c01d7f-kube-api-access-gwd8n\") pod \"518b33ee-87c2-4090-a971-735d98c01d7f\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " Feb 23 14:48:25.402566 master-0 kubenswrapper[28758]: I0223 14:48:25.402533 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/518b33ee-87c2-4090-a971-735d98c01d7f-etc-swift\") pod \"518b33ee-87c2-4090-a971-735d98c01d7f\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " Feb 23 14:48:25.402648 master-0 kubenswrapper[28758]: I0223 14:48:25.401984 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "518b33ee-87c2-4090-a971-735d98c01d7f" (UID: "518b33ee-87c2-4090-a971-735d98c01d7f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:25.402648 master-0 kubenswrapper[28758]: I0223 14:48:25.402590 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-swiftconf\") pod \"518b33ee-87c2-4090-a971-735d98c01d7f\" (UID: \"518b33ee-87c2-4090-a971-735d98c01d7f\") " Feb 23 14:48:25.403059 master-0 kubenswrapper[28758]: I0223 14:48:25.403023 28758 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:25.403328 master-0 kubenswrapper[28758]: I0223 14:48:25.403292 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/518b33ee-87c2-4090-a971-735d98c01d7f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "518b33ee-87c2-4090-a971-735d98c01d7f" (UID: 
"518b33ee-87c2-4090-a971-735d98c01d7f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:48:25.405255 master-0 kubenswrapper[28758]: I0223 14:48:25.405193 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518b33ee-87c2-4090-a971-735d98c01d7f-kube-api-access-gwd8n" (OuterVolumeSpecName: "kube-api-access-gwd8n") pod "518b33ee-87c2-4090-a971-735d98c01d7f" (UID: "518b33ee-87c2-4090-a971-735d98c01d7f"). InnerVolumeSpecName "kube-api-access-gwd8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:48:25.408388 master-0 kubenswrapper[28758]: I0223 14:48:25.407926 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "518b33ee-87c2-4090-a971-735d98c01d7f" (UID: "518b33ee-87c2-4090-a971-735d98c01d7f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:25.434020 master-0 kubenswrapper[28758]: I0223 14:48:25.433971 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-scripts" (OuterVolumeSpecName: "scripts") pod "518b33ee-87c2-4090-a971-735d98c01d7f" (UID: "518b33ee-87c2-4090-a971-735d98c01d7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:25.441070 master-0 kubenswrapper[28758]: I0223 14:48:25.440671 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "518b33ee-87c2-4090-a971-735d98c01d7f" (UID: "518b33ee-87c2-4090-a971-735d98c01d7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:25.441632 master-0 kubenswrapper[28758]: I0223 14:48:25.441536 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "518b33ee-87c2-4090-a971-735d98c01d7f" (UID: "518b33ee-87c2-4090-a971-735d98c01d7f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:25.507869 master-0 kubenswrapper[28758]: I0223 14:48:25.506932 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/518b33ee-87c2-4090-a971-735d98c01d7f-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:25.508273 master-0 kubenswrapper[28758]: I0223 14:48:25.508225 28758 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-dispersionconf\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:25.508273 master-0 kubenswrapper[28758]: I0223 14:48:25.508255 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:25.508273 master-0 kubenswrapper[28758]: I0223 14:48:25.508271 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwd8n\" (UniqueName: \"kubernetes.io/projected/518b33ee-87c2-4090-a971-735d98c01d7f-kube-api-access-gwd8n\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:25.508437 master-0 kubenswrapper[28758]: I0223 14:48:25.508284 28758 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/518b33ee-87c2-4090-a971-735d98c01d7f-etc-swift\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:25.508437 master-0 kubenswrapper[28758]: I0223 14:48:25.508295 28758 
reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/518b33ee-87c2-4090-a971-735d98c01d7f-swiftconf\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:25.806811 master-0 kubenswrapper[28758]: I0223 14:48:25.806741 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m897g" event={"ID":"518b33ee-87c2-4090-a971-735d98c01d7f","Type":"ContainerDied","Data":"8cc31f394eb8a028a9987aef34e7fb4796a685e7fa9c552a2a03856c2af485b3"} Feb 23 14:48:25.806811 master-0 kubenswrapper[28758]: I0223 14:48:25.806798 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cc31f394eb8a028a9987aef34e7fb4796a685e7fa9c552a2a03856c2af485b3" Feb 23 14:48:25.806811 master-0 kubenswrapper[28758]: I0223 14:48:25.806776 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m897g" Feb 23 14:48:25.810398 master-0 kubenswrapper[28758]: I0223 14:48:25.810331 28758 generic.go:334] "Generic (PLEG): container finished" podID="808186bd-8972-4a9d-9db7-7c3456319733" containerID="f95a64a7e7d566f16528df0084b6f2b4ce07211ef212bae147bff68ed53b269a" exitCode=0 Feb 23 14:48:25.810533 master-0 kubenswrapper[28758]: I0223 14:48:25.810398 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jwf8n" event={"ID":"808186bd-8972-4a9d-9db7-7c3456319733","Type":"ContainerDied","Data":"f95a64a7e7d566f16528df0084b6f2b4ce07211ef212bae147bff68ed53b269a"} Feb 23 14:48:25.810533 master-0 kubenswrapper[28758]: I0223 14:48:25.810429 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jwf8n" event={"ID":"808186bd-8972-4a9d-9db7-7c3456319733","Type":"ContainerStarted","Data":"e2de861fef70615819d9df8a8836fa008750cdcdd0677056947b3ef25470fe17"} Feb 23 14:48:26.064597 master-0 kubenswrapper[28758]: I0223 14:48:26.064545 28758 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 23 14:48:26.595280 master-0 kubenswrapper[28758]: I0223 14:48:26.595229 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-m256t"] Feb 23 14:48:26.596238 master-0 kubenswrapper[28758]: E0223 14:48:26.596220 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="518b33ee-87c2-4090-a971-735d98c01d7f" containerName="swift-ring-rebalance" Feb 23 14:48:26.596323 master-0 kubenswrapper[28758]: I0223 14:48:26.596313 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="518b33ee-87c2-4090-a971-735d98c01d7f" containerName="swift-ring-rebalance" Feb 23 14:48:26.596635 master-0 kubenswrapper[28758]: I0223 14:48:26.596620 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="518b33ee-87c2-4090-a971-735d98c01d7f" containerName="swift-ring-rebalance" Feb 23 14:48:26.597400 master-0 kubenswrapper[28758]: I0223 14:48:26.597333 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.600864 master-0 kubenswrapper[28758]: I0223 14:48:26.600837 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-63e78-config-data" Feb 23 14:48:26.611119 master-0 kubenswrapper[28758]: I0223 14:48:26.611075 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m256t"] Feb 23 14:48:26.631158 master-0 kubenswrapper[28758]: I0223 14:48:26.631098 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-config-data\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.631407 master-0 kubenswrapper[28758]: I0223 14:48:26.631179 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-db-sync-config-data\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.631407 master-0 kubenswrapper[28758]: I0223 14:48:26.631233 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-combined-ca-bundle\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.631407 master-0 kubenswrapper[28758]: I0223 14:48:26.631279 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/3f87df8a-ee8a-4398-b0ce-817da20c6349-kube-api-access-x5mbl\") pod \"glance-db-sync-m256t\" (UID: 
\"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.733333 master-0 kubenswrapper[28758]: I0223 14:48:26.733265 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-config-data\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.733333 master-0 kubenswrapper[28758]: I0223 14:48:26.733330 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-db-sync-config-data\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.733654 master-0 kubenswrapper[28758]: I0223 14:48:26.733621 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-combined-ca-bundle\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.733772 master-0 kubenswrapper[28758]: I0223 14:48:26.733723 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/3f87df8a-ee8a-4398-b0ce-817da20c6349-kube-api-access-x5mbl\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.736551 master-0 kubenswrapper[28758]: I0223 14:48:26.736510 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-db-sync-config-data\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " 
pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.737050 master-0 kubenswrapper[28758]: I0223 14:48:26.737028 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-combined-ca-bundle\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.737158 master-0 kubenswrapper[28758]: I0223 14:48:26.737036 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-config-data\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.749762 master-0 kubenswrapper[28758]: I0223 14:48:26.749702 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/3f87df8a-ee8a-4398-b0ce-817da20c6349-kube-api-access-x5mbl\") pod \"glance-db-sync-m256t\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " pod="openstack/glance-db-sync-m256t" Feb 23 14:48:26.919098 master-0 kubenswrapper[28758]: I0223 14:48:26.919008 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m256t" Feb 23 14:48:27.168188 master-0 kubenswrapper[28758]: I0223 14:48:27.153422 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0" Feb 23 14:48:27.168188 master-0 kubenswrapper[28758]: I0223 14:48:27.158400 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3b978484-31f3-46af-aedd-e96a997da517-etc-swift\") pod \"swift-storage-0\" (UID: \"3b978484-31f3-46af-aedd-e96a997da517\") " pod="openstack/swift-storage-0" Feb 23 14:48:27.276520 master-0 kubenswrapper[28758]: I0223 14:48:27.271348 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:27.357332 master-0 kubenswrapper[28758]: I0223 14:48:27.357261 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808186bd-8972-4a9d-9db7-7c3456319733-operator-scripts\") pod \"808186bd-8972-4a9d-9db7-7c3456319733\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " Feb 23 14:48:27.357609 master-0 kubenswrapper[28758]: I0223 14:48:27.357393 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfz6j\" (UniqueName: \"kubernetes.io/projected/808186bd-8972-4a9d-9db7-7c3456319733-kube-api-access-rfz6j\") pod \"808186bd-8972-4a9d-9db7-7c3456319733\" (UID: \"808186bd-8972-4a9d-9db7-7c3456319733\") " Feb 23 14:48:27.357856 master-0 kubenswrapper[28758]: I0223 14:48:27.357805 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/808186bd-8972-4a9d-9db7-7c3456319733-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "808186bd-8972-4a9d-9db7-7c3456319733" (UID: "808186bd-8972-4a9d-9db7-7c3456319733"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:27.358195 master-0 kubenswrapper[28758]: I0223 14:48:27.358164 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/808186bd-8972-4a9d-9db7-7c3456319733-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:27.362694 master-0 kubenswrapper[28758]: I0223 14:48:27.361582 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808186bd-8972-4a9d-9db7-7c3456319733-kube-api-access-rfz6j" (OuterVolumeSpecName: "kube-api-access-rfz6j") pod "808186bd-8972-4a9d-9db7-7c3456319733" (UID: "808186bd-8972-4a9d-9db7-7c3456319733"). InnerVolumeSpecName "kube-api-access-rfz6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:48:27.409605 master-0 kubenswrapper[28758]: I0223 14:48:27.409446 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 23 14:48:27.461226 master-0 kubenswrapper[28758]: I0223 14:48:27.461190 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfz6j\" (UniqueName: \"kubernetes.io/projected/808186bd-8972-4a9d-9db7-7c3456319733-kube-api-access-rfz6j\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:27.538746 master-0 kubenswrapper[28758]: I0223 14:48:27.538660 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m256t"] Feb 23 14:48:27.539312 master-0 kubenswrapper[28758]: W0223 14:48:27.539259 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f87df8a_ee8a_4398_b0ce_817da20c6349.slice/crio-239108202cc5ec2de464b739bf1756d2bb8c6ef866f06fc126534a54618a82db WatchSource:0}: Error finding container 239108202cc5ec2de464b739bf1756d2bb8c6ef866f06fc126534a54618a82db: Status 404 returned error can't find the container with id 239108202cc5ec2de464b739bf1756d2bb8c6ef866f06fc126534a54618a82db Feb 23 14:48:27.832513 master-0 kubenswrapper[28758]: I0223 14:48:27.832433 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jwf8n" Feb 23 14:48:27.833339 master-0 kubenswrapper[28758]: I0223 14:48:27.832432 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jwf8n" event={"ID":"808186bd-8972-4a9d-9db7-7c3456319733","Type":"ContainerDied","Data":"e2de861fef70615819d9df8a8836fa008750cdcdd0677056947b3ef25470fe17"} Feb 23 14:48:27.833339 master-0 kubenswrapper[28758]: I0223 14:48:27.832599 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2de861fef70615819d9df8a8836fa008750cdcdd0677056947b3ef25470fe17" Feb 23 14:48:27.833885 master-0 kubenswrapper[28758]: I0223 14:48:27.833850 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m256t" event={"ID":"3f87df8a-ee8a-4398-b0ce-817da20c6349","Type":"ContainerStarted","Data":"239108202cc5ec2de464b739bf1756d2bb8c6ef866f06fc126534a54618a82db"} Feb 23 14:48:27.857340 master-0 kubenswrapper[28758]: I0223 14:48:27.857277 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 23 14:48:28.844609 master-0 kubenswrapper[28758]: I0223 14:48:28.844561 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"8721358623cc5ac6225c9286a2ce06b2850e171411826fe617b82478472cb960"} Feb 23 14:48:29.100412 master-0 kubenswrapper[28758]: I0223 14:48:29.100274 28758 trace.go:236] Trace[980768592]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (23-Feb-2026 14:48:28.076) (total time: 1023ms): Feb 23 14:48:29.100412 master-0 kubenswrapper[28758]: Trace[980768592]: [1.023781301s] [1.023781301s] END Feb 23 14:48:29.266122 master-0 kubenswrapper[28758]: I0223 14:48:29.266072 28758 trace.go:236] Trace[2053108537]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (23-Feb-2026 
14:48:28.078) (total time: 1187ms):
Feb 23 14:48:29.266122 master-0 kubenswrapper[28758]: Trace[2053108537]: [1.187057199s] [1.187057199s] END
Feb 23 14:48:29.859615 master-0 kubenswrapper[28758]: I0223 14:48:29.859557 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"310d604a817ade955e385094ef3dfa6f7611202e4d85e20ef07ebc41ed96d33a"}
Feb 23 14:48:29.859615 master-0 kubenswrapper[28758]: I0223 14:48:29.859614 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"436f861b540e87343f346ebd584c8bc4c53749ce7f2a5ddec1d8ffb9fe05c8f7"}
Feb 23 14:48:29.859615 master-0 kubenswrapper[28758]: I0223 14:48:29.859629 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"c6227871d0177d7bc1197f34e1e77b7975130199ca00e97478d617e1da0cc9e8"}
Feb 23 14:48:29.860860 master-0 kubenswrapper[28758]: I0223 14:48:29.859638 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"da6be9c45008460331b42c36bc2c556c39053af79fa56d0cab210025c7b23d9b"}
Feb 23 14:48:31.904323 master-0 kubenswrapper[28758]: I0223 14:48:31.904224 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"cba3cf764286ffbe3e3bfa4bdd42d81f67f531d89b4354675f0e3c0b8be3ffbe"}
Feb 23 14:48:31.904323 master-0 kubenswrapper[28758]: I0223 14:48:31.904285 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"09e9b05ce73ba6cb439ee558222959828b82ba1ac241369dc77809dcafe5bae9"}
Feb 23 14:48:31.904323 master-0 kubenswrapper[28758]: I0223 14:48:31.904302 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"33cde58a4df215a6ecc9b4b5ea8f33551847d421491aeaaec6701d8cc760ceab"}
Feb 23 14:48:31.949232 master-0 kubenswrapper[28758]: I0223 14:48:31.949170 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lssgn" podUID="b9342b63-0ec2-4c10-898a-cebd5e86414a" containerName="ovn-controller" probeResult="failure" output=<
Feb 23 14:48:31.949232 master-0 kubenswrapper[28758]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 23 14:48:31.949232 master-0 kubenswrapper[28758]: >
Feb 23 14:48:32.046500 master-0 kubenswrapper[28758]: I0223 14:48:32.046253 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:48:32.050609 master-0 kubenswrapper[28758]: I0223 14:48:32.050360 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-td6ds"
Feb 23 14:48:32.422594 master-0 kubenswrapper[28758]: I0223 14:48:32.421600 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lssgn-config-8zs62"]
Feb 23 14:48:32.434251 master-0 kubenswrapper[28758]: E0223 14:48:32.434124 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808186bd-8972-4a9d-9db7-7c3456319733" containerName="mariadb-account-create-update"
Feb 23 14:48:32.434471 master-0 kubenswrapper[28758]: I0223 14:48:32.434321 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="808186bd-8972-4a9d-9db7-7c3456319733" containerName="mariadb-account-create-update"
Feb 23 14:48:32.435862 master-0 kubenswrapper[28758]: I0223 14:48:32.435825 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="808186bd-8972-4a9d-9db7-7c3456319733" containerName="mariadb-account-create-update"
Feb 23 14:48:32.445606 master-0 kubenswrapper[28758]: I0223 14:48:32.444774 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.453744 master-0 kubenswrapper[28758]: I0223 14:48:32.453245 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lssgn-config-8zs62"]
Feb 23 14:48:32.458301 master-0 kubenswrapper[28758]: I0223 14:48:32.458212 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 23 14:48:32.481320 master-0 kubenswrapper[28758]: I0223 14:48:32.481262 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-scripts\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.481660 master-0 kubenswrapper[28758]: I0223 14:48:32.481615 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-log-ovn\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.481850 master-0 kubenswrapper[28758]: I0223 14:48:32.481738 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-additional-scripts\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.481850 master-0 kubenswrapper[28758]: I0223 14:48:32.481780 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.481986 master-0 kubenswrapper[28758]: I0223 14:48:32.481853 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7jt\" (UniqueName: \"kubernetes.io/projected/0d387f59-bf80-4c79-ad36-cef7eff60db2-kube-api-access-wt7jt\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.482225 master-0 kubenswrapper[28758]: I0223 14:48:32.482019 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run-ovn\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.585211 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-log-ovn\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.585294 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-additional-scripts\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.585317 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.585351 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7jt\" (UniqueName: \"kubernetes.io/projected/0d387f59-bf80-4c79-ad36-cef7eff60db2-kube-api-access-wt7jt\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.585390 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run-ovn\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.585445 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-scripts\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.586563 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.586628 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-log-ovn\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.587102 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-additional-scripts\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.587417 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run-ovn\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.588564 master-0 kubenswrapper[28758]: I0223 14:48:32.588348 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-scripts\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.624498 master-0 kubenswrapper[28758]: I0223 14:48:32.624169 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7jt\" (UniqueName: \"kubernetes.io/projected/0d387f59-bf80-4c79-ad36-cef7eff60db2-kube-api-access-wt7jt\") pod \"ovn-controller-lssgn-config-8zs62\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") " pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.787175 master-0 kubenswrapper[28758]: I0223 14:48:32.787115 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:32.924505 master-0 kubenswrapper[28758]: I0223 14:48:32.923824 28758 generic.go:334] "Generic (PLEG): container finished" podID="373bbd85-b2d4-40a4-afc1-3ecf50a666e7" containerID="28e4b899ada51892b7fdfdf96a548fe19b7e2f8172bb6e0f29396a68e2ad1446" exitCode=0
Feb 23 14:48:32.924505 master-0 kubenswrapper[28758]: I0223 14:48:32.923929 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"373bbd85-b2d4-40a4-afc1-3ecf50a666e7","Type":"ContainerDied","Data":"28e4b899ada51892b7fdfdf96a548fe19b7e2f8172bb6e0f29396a68e2ad1446"}
Feb 23 14:48:32.930255 master-0 kubenswrapper[28758]: I0223 14:48:32.930210 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"5c53af74423cc0e3144c315552eab1d6c4957030fbdc85c7390d62f9dda0755a"}
Feb 23 14:48:32.932154 master-0 kubenswrapper[28758]: I0223 14:48:32.932100 28758 generic.go:334] "Generic (PLEG): container finished" podID="e18414da-932f-4a26-ab6a-af32aa83196b" containerID="5f9cd75c8ea3610b17762a7a2a82fbd63abba29b71c7f31c59a93ce46315286e" exitCode=0
Feb 23 14:48:32.933286 master-0 kubenswrapper[28758]: I0223 14:48:32.932863 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e18414da-932f-4a26-ab6a-af32aa83196b","Type":"ContainerDied","Data":"5f9cd75c8ea3610b17762a7a2a82fbd63abba29b71c7f31c59a93ce46315286e"}
Feb 23 14:48:33.221244
master-0 kubenswrapper[28758]: I0223 14:48:33.221174 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lssgn-config-8zs62"]
Feb 23 14:48:36.948365 master-0 kubenswrapper[28758]: I0223 14:48:36.948283 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lssgn" podUID="b9342b63-0ec2-4c10-898a-cebd5e86414a" containerName="ovn-controller" probeResult="failure" output=<
Feb 23 14:48:36.948365 master-0 kubenswrapper[28758]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 23 14:48:36.948365 master-0 kubenswrapper[28758]: >
Feb 23 14:48:39.015424 master-0 kubenswrapper[28758]: W0223 14:48:39.015371 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d387f59_bf80_4c79_ad36_cef7eff60db2.slice/crio-e724eee4c70b276c40f4a6ac0b82b47796ed0e1b88e457bf09d411af10c17087 WatchSource:0}: Error finding container e724eee4c70b276c40f4a6ac0b82b47796ed0e1b88e457bf09d411af10c17087: Status 404 returned error can't find the container with id e724eee4c70b276c40f4a6ac0b82b47796ed0e1b88e457bf09d411af10c17087
Feb 23 14:48:40.015530 master-0 kubenswrapper[28758]: I0223 14:48:40.015449 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e18414da-932f-4a26-ab6a-af32aa83196b","Type":"ContainerStarted","Data":"f7163f17fd47d8c30bc43fb9b1ccd39e7fa6027eff1ad4321bd47e2e10988a87"}
Feb 23 14:48:40.016075 master-0 kubenswrapper[28758]: I0223 14:48:40.015782 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 23 14:48:40.021172 master-0 kubenswrapper[28758]: I0223 14:48:40.021119 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0"
event={"ID":"373bbd85-b2d4-40a4-afc1-3ecf50a666e7","Type":"ContainerStarted","Data":"d8ec818f60c0021d56dfa83e3bf893f4acc7fe70215db676ffbae6026afb86b3"}
Feb 23 14:48:40.021414 master-0 kubenswrapper[28758]: I0223 14:48:40.021348 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 23 14:48:40.023822 master-0 kubenswrapper[28758]: I0223 14:48:40.023766 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lssgn-config-8zs62" event={"ID":"0d387f59-bf80-4c79-ad36-cef7eff60db2","Type":"ContainerStarted","Data":"c9fc6d29001166e45cfb2215c22cf38e0f80fb836ecd9bcc51373fcda30039cc"}
Feb 23 14:48:40.023822 master-0 kubenswrapper[28758]: I0223 14:48:40.023807 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lssgn-config-8zs62" event={"ID":"0d387f59-bf80-4c79-ad36-cef7eff60db2","Type":"ContainerStarted","Data":"e724eee4c70b276c40f4a6ac0b82b47796ed0e1b88e457bf09d411af10c17087"}
Feb 23 14:48:40.039319 master-0 kubenswrapper[28758]: I0223 14:48:40.039260 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"f25c04a1d064ed99d300f793b383fccb20acff58da1dff506b1ef4006bf204b7"}
Feb 23 14:48:40.053087 master-0 kubenswrapper[28758]: I0223 14:48:40.052941 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=61.529211962 podStartE2EDuration="1m9.052921141s" podCreationTimestamp="2026-02-23 14:47:31 +0000 UTC" firstStartedPulling="2026-02-23 14:47:49.926694639 +0000 UTC m=+802.053010571" lastFinishedPulling="2026-02-23 14:47:57.450403818 +0000 UTC m=+809.576719750" observedRunningTime="2026-02-23 14:48:40.04689151 +0000 UTC m=+852.173207452" watchObservedRunningTime="2026-02-23 14:48:40.052921141 +0000 UTC m=+852.179237073"
Feb 23 14:48:40.093879 master-0 kubenswrapper[28758]: I0223 14:48:40.091997 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=61.602423186 podStartE2EDuration="1m9.091970988s" podCreationTimestamp="2026-02-23 14:47:31 +0000 UTC" firstStartedPulling="2026-02-23 14:47:49.960886807 +0000 UTC m=+802.087202739" lastFinishedPulling="2026-02-23 14:47:57.450434609 +0000 UTC m=+809.576750541" observedRunningTime="2026-02-23 14:48:40.081232563 +0000 UTC m=+852.207548505" watchObservedRunningTime="2026-02-23 14:48:40.091970988 +0000 UTC m=+852.218286920"
Feb 23 14:48:40.114319 master-0 kubenswrapper[28758]: I0223 14:48:40.114171 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lssgn-config-8zs62" podStartSLOduration=8.114140807 podStartE2EDuration="8.114140807s" podCreationTimestamp="2026-02-23 14:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:40.106246238 +0000 UTC m=+852.232562170" watchObservedRunningTime="2026-02-23 14:48:40.114140807 +0000 UTC m=+852.240456739"
Feb 23 14:48:41.050380 master-0 kubenswrapper[28758]: I0223 14:48:41.050282 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m256t" event={"ID":"3f87df8a-ee8a-4398-b0ce-817da20c6349","Type":"ContainerStarted","Data":"c330f7ca02c23b53430e906257f14262ee5d999e85979aacf84635b10c49e9f1"}
Feb 23 14:48:41.052191 master-0 kubenswrapper[28758]: I0223 14:48:41.052157 28758 generic.go:334] "Generic (PLEG): container finished" podID="0d387f59-bf80-4c79-ad36-cef7eff60db2" containerID="c9fc6d29001166e45cfb2215c22cf38e0f80fb836ecd9bcc51373fcda30039cc" exitCode=0
Feb 23 14:48:41.052264 master-0 kubenswrapper[28758]: I0223 14:48:41.052212 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lssgn-config-8zs62" event={"ID":"0d387f59-bf80-4c79-ad36-cef7eff60db2","Type":"ContainerDied","Data":"c9fc6d29001166e45cfb2215c22cf38e0f80fb836ecd9bcc51373fcda30039cc"}
Feb 23 14:48:41.059919 master-0 kubenswrapper[28758]: I0223 14:48:41.059843 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"1272dc3898d0b388be4afcbc671457869647eed1c38dfd9386163ea93baa0dfd"}
Feb 23 14:48:41.059919 master-0 kubenswrapper[28758]: I0223 14:48:41.059878 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"e6d47f18da0e2bc0b106c40238d255cba892948484a12b42eab9ef0eb350869d"}
Feb 23 14:48:41.059919 master-0 kubenswrapper[28758]: I0223 14:48:41.059888 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"95a3c110e2ec692d9a3b24ff80dad210af35ad6d95a9bcb607de50b33945a37f"}
Feb 23 14:48:41.059919 master-0 kubenswrapper[28758]: I0223 14:48:41.059896 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"f57eacefb63f2cf54abe296387f1e623cd3a49fbde432f242a4c1c0067883c85"}
Feb 23 14:48:41.059919 master-0 kubenswrapper[28758]: I0223 14:48:41.059906 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"25db04c35c737259b60bc19319a8f5809b4789ef345db127f992e22db15115e5"}
Feb 23 14:48:41.081722 master-0 kubenswrapper[28758]: I0223 14:48:41.081631 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-m256t" podStartSLOduration=3.210622 podStartE2EDuration="15.081609652s" podCreationTimestamp="2026-02-23 14:48:26 +0000 UTC" firstStartedPulling="2026-02-23 14:48:27.541421701 +0000 UTC m=+839.667737633" lastFinishedPulling="2026-02-23 14:48:39.412409353 +0000 UTC m=+851.538725285" observedRunningTime="2026-02-23 14:48:41.075749057 +0000 UTC m=+853.202065009" watchObservedRunningTime="2026-02-23 14:48:41.081609652 +0000 UTC m=+853.207925584"
Feb 23 14:48:41.939909 master-0 kubenswrapper[28758]: I0223 14:48:41.939842 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lssgn"
Feb 23 14:48:42.073109 master-0 kubenswrapper[28758]: I0223 14:48:42.073034 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3b978484-31f3-46af-aedd-e96a997da517","Type":"ContainerStarted","Data":"013162c6d58f35747211fc8743fae60378dcb0a787915d4557ba6f66f4c6c306"}
Feb 23 14:48:42.130518 master-0 kubenswrapper[28758]: I0223 14:48:42.128560 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=22.577722874 podStartE2EDuration="34.128533549s" podCreationTimestamp="2026-02-23 14:48:08 +0000 UTC" firstStartedPulling="2026-02-23 14:48:27.861563557 +0000 UTC m=+839.987879489" lastFinishedPulling="2026-02-23 14:48:39.412374202 +0000 UTC m=+851.538690164" observedRunningTime="2026-02-23 14:48:42.116279273 +0000 UTC m=+854.242595205" watchObservedRunningTime="2026-02-23 14:48:42.128533549 +0000 UTC m=+854.254849481"
Feb 23 14:48:42.466503 master-0 kubenswrapper[28758]: I0223 14:48:42.463971 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fcdbc89d5-z97vk"]
Feb 23 14:48:42.487095 master-0 kubenswrapper[28758]: I0223 14:48:42.476717 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.487095 master-0 kubenswrapper[28758]: I0223 14:48:42.480442 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 23 14:48:42.505533 master-0 kubenswrapper[28758]: I0223 14:48:42.505463 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.506086 master-0 kubenswrapper[28758]: I0223 14:48:42.506016 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcc5l\" (UniqueName: \"kubernetes.io/projected/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-kube-api-access-gcc5l\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.506142 master-0 kubenswrapper[28758]: I0223 14:48:42.506117 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-config\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.506177 master-0 kubenswrapper[28758]: I0223 14:48:42.506164 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-svc\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.506532 master-0 kubenswrapper[28758]: I0223 14:48:42.506509 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-swift-storage-0\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.506623 master-0 kubenswrapper[28758]: I0223 14:48:42.506552 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-sb\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.513946 master-0 kubenswrapper[28758]: I0223 14:48:42.513883 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fcdbc89d5-z97vk"]
Feb 23 14:48:42.605551 master-0 kubenswrapper[28758]: I0223 14:48:42.605420 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:42.608025 master-0 kubenswrapper[28758]: I0223 14:48:42.607969 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcc5l\" (UniqueName: \"kubernetes.io/projected/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-kube-api-access-gcc5l\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.608131 master-0 kubenswrapper[28758]: I0223 14:48:42.608035 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-config\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.609558 master-0 kubenswrapper[28758]: I0223 14:48:42.608228 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-svc\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.609558 master-0 kubenswrapper[28758]: I0223 14:48:42.608621 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-swift-storage-0\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.609558 master-0 kubenswrapper[28758]: I0223 14:48:42.608660 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-sb\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.609558 master-0 kubenswrapper[28758]: I0223 14:48:42.608868 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.609558 master-0 kubenswrapper[28758]: I0223 14:48:42.609147 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-config\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.609558 master-0 kubenswrapper[28758]: I0223 14:48:42.609289 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-svc\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.610011 master-0 kubenswrapper[28758]: I0223 14:48:42.609926 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-nb\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.610159 master-0 kubenswrapper[28758]: I0223 14:48:42.610092 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-swift-storage-0\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.610229 master-0 kubenswrapper[28758]: I0223 14:48:42.610151 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-sb\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.631744 master-0 kubenswrapper[28758]: I0223 14:48:42.630979 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcc5l\" (UniqueName: \"kubernetes.io/projected/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-kube-api-access-gcc5l\") pod \"dnsmasq-dns-7fcdbc89d5-z97vk\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.710231 master-0 kubenswrapper[28758]: I0223 14:48:42.709826 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-additional-scripts\") pod \"0d387f59-bf80-4c79-ad36-cef7eff60db2\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") "
Feb 23 14:48:42.710231 master-0 kubenswrapper[28758]: I0223 14:48:42.709945 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-scripts\") pod \"0d387f59-bf80-4c79-ad36-cef7eff60db2\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") "
Feb 23 14:48:42.710231 master-0 kubenswrapper[28758]: I0223 14:48:42.710049 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run-ovn\") pod \"0d387f59-bf80-4c79-ad36-cef7eff60db2\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") "
Feb 23 14:48:42.710231 master-0 kubenswrapper[28758]: I0223 14:48:42.710153 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt7jt\" (UniqueName: \"kubernetes.io/projected/0d387f59-bf80-4c79-ad36-cef7eff60db2-kube-api-access-wt7jt\") pod \"0d387f59-bf80-4c79-ad36-cef7eff60db2\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") "
Feb 23 14:48:42.711224 master-0 kubenswrapper[28758]: I0223 14:48:42.710409 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run\") pod \"0d387f59-bf80-4c79-ad36-cef7eff60db2\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") "
Feb 23 14:48:42.711224 master-0 kubenswrapper[28758]: I0223 14:48:42.710485 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-log-ovn\") pod \"0d387f59-bf80-4c79-ad36-cef7eff60db2\" (UID: \"0d387f59-bf80-4c79-ad36-cef7eff60db2\") "
Feb 23 14:48:42.711618 master-0 kubenswrapper[28758]: I0223 14:48:42.711305 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0d387f59-bf80-4c79-ad36-cef7eff60db2" (UID: "0d387f59-bf80-4c79-ad36-cef7eff60db2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:48:42.711618 master-0 kubenswrapper[28758]: I0223 14:48:42.711357 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run" (OuterVolumeSpecName: "var-run") pod "0d387f59-bf80-4c79-ad36-cef7eff60db2" (UID: "0d387f59-bf80-4c79-ad36-cef7eff60db2"). InnerVolumeSpecName "var-run".
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:48:42.711618 master-0 kubenswrapper[28758]: I0223 14:48:42.711440 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-scripts" (OuterVolumeSpecName: "scripts") pod "0d387f59-bf80-4c79-ad36-cef7eff60db2" (UID: "0d387f59-bf80-4c79-ad36-cef7eff60db2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:42.711819 master-0 kubenswrapper[28758]: I0223 14:48:42.711624 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0d387f59-bf80-4c79-ad36-cef7eff60db2" (UID: "0d387f59-bf80-4c79-ad36-cef7eff60db2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:48:42.712587 master-0 kubenswrapper[28758]: I0223 14:48:42.712551 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0d387f59-bf80-4c79-ad36-cef7eff60db2" (UID: "0d387f59-bf80-4c79-ad36-cef7eff60db2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:42.714325 master-0 kubenswrapper[28758]: I0223 14:48:42.714289 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d387f59-bf80-4c79-ad36-cef7eff60db2-kube-api-access-wt7jt" (OuterVolumeSpecName: "kube-api-access-wt7jt") pod "0d387f59-bf80-4c79-ad36-cef7eff60db2" (UID: "0d387f59-bf80-4c79-ad36-cef7eff60db2"). InnerVolumeSpecName "kube-api-access-wt7jt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:48:42.812627 master-0 kubenswrapper[28758]: I0223 14:48:42.812550 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:42.814044 master-0 kubenswrapper[28758]: I0223 14:48:42.813863 28758 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-log-ovn\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:42.814102 master-0 kubenswrapper[28758]: I0223 14:48:42.814051 28758 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-additional-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:42.814143 master-0 kubenswrapper[28758]: I0223 14:48:42.814104 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d387f59-bf80-4c79-ad36-cef7eff60db2-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:42.814143 master-0 kubenswrapper[28758]: I0223 14:48:42.814119 28758 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run-ovn\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:42.814143 master-0 kubenswrapper[28758]: I0223 14:48:42.814137 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt7jt\" (UniqueName: \"kubernetes.io/projected/0d387f59-bf80-4c79-ad36-cef7eff60db2-kube-api-access-wt7jt\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:42.814234 master-0 kubenswrapper[28758]: I0223 14:48:42.814152 28758 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0d387f59-bf80-4c79-ad36-cef7eff60db2-var-run\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:43.092259 master-0 kubenswrapper[28758]: I0223 14:48:43.091754 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lssgn-config-8zs62"
Feb 23 14:48:43.092259 master-0 kubenswrapper[28758]: I0223 14:48:43.091820 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lssgn-config-8zs62" event={"ID":"0d387f59-bf80-4c79-ad36-cef7eff60db2","Type":"ContainerDied","Data":"e724eee4c70b276c40f4a6ac0b82b47796ed0e1b88e457bf09d411af10c17087"}
Feb 23 14:48:43.092259 master-0 kubenswrapper[28758]: I0223 14:48:43.091853 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e724eee4c70b276c40f4a6ac0b82b47796ed0e1b88e457bf09d411af10c17087"
Feb 23 14:48:43.378758 master-0 kubenswrapper[28758]: W0223 14:48:43.378673 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc93cc5fc_40f5_42ab_bb10_10d64fd205f8.slice/crio-728ae3134d6e2469ff4e5f308301979f986db6053a9d57958b9c2e53ccfae22f WatchSource:0}: Error finding container 728ae3134d6e2469ff4e5f308301979f986db6053a9d57958b9c2e53ccfae22f: Status 404 returned error can't find the container with id 728ae3134d6e2469ff4e5f308301979f986db6053a9d57958b9c2e53ccfae22f
Feb 23 14:48:43.381602 master-0 kubenswrapper[28758]: I0223 14:48:43.381558 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fcdbc89d5-z97vk"]
Feb 23 14:48:43.747704 master-0 kubenswrapper[28758]: I0223 14:48:43.745450 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lssgn-config-8zs62"]
Feb 23 14:48:43.759509 master-0 kubenswrapper[28758]: I0223 14:48:43.758969 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lssgn-config-8zs62"]
Feb 23 14:48:44.104770 master-0 kubenswrapper[28758]: I0223 14:48:44.104675 28758 generic.go:334] "Generic (PLEG): container finished"
podID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerID="2f7b6e79b4101bf5235cfbfd048c3ff0316f64ce2d723ab0bcb8e0efdd984674" exitCode=0 Feb 23 14:48:44.120082 master-0 kubenswrapper[28758]: I0223 14:48:44.118374 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d387f59-bf80-4c79-ad36-cef7eff60db2" path="/var/lib/kubelet/pods/0d387f59-bf80-4c79-ad36-cef7eff60db2/volumes" Feb 23 14:48:44.120082 master-0 kubenswrapper[28758]: I0223 14:48:44.119390 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" event={"ID":"c93cc5fc-40f5-42ab-bb10-10d64fd205f8","Type":"ContainerDied","Data":"2f7b6e79b4101bf5235cfbfd048c3ff0316f64ce2d723ab0bcb8e0efdd984674"} Feb 23 14:48:44.120082 master-0 kubenswrapper[28758]: I0223 14:48:44.119432 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" event={"ID":"c93cc5fc-40f5-42ab-bb10-10d64fd205f8","Type":"ContainerStarted","Data":"728ae3134d6e2469ff4e5f308301979f986db6053a9d57958b9c2e53ccfae22f"} Feb 23 14:48:45.148040 master-0 kubenswrapper[28758]: I0223 14:48:45.147970 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" event={"ID":"c93cc5fc-40f5-42ab-bb10-10d64fd205f8","Type":"ContainerStarted","Data":"c0046d1158b93c96a8e45b65634774215919ae07e0205924e29d7b1e77f80005"} Feb 23 14:48:45.149394 master-0 kubenswrapper[28758]: I0223 14:48:45.149356 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" Feb 23 14:48:45.174801 master-0 kubenswrapper[28758]: I0223 14:48:45.174691 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" podStartSLOduration=3.174666702 podStartE2EDuration="3.174666702s" podCreationTimestamp="2026-02-23 14:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 14:48:45.168373684 +0000 UTC m=+857.294689616" watchObservedRunningTime="2026-02-23 14:48:45.174666702 +0000 UTC m=+857.300982644" Feb 23 14:48:48.200887 master-0 kubenswrapper[28758]: I0223 14:48:48.200770 28758 generic.go:334] "Generic (PLEG): container finished" podID="3f87df8a-ee8a-4398-b0ce-817da20c6349" containerID="c330f7ca02c23b53430e906257f14262ee5d999e85979aacf84635b10c49e9f1" exitCode=0 Feb 23 14:48:48.200887 master-0 kubenswrapper[28758]: I0223 14:48:48.200836 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m256t" event={"ID":"3f87df8a-ee8a-4398-b0ce-817da20c6349","Type":"ContainerDied","Data":"c330f7ca02c23b53430e906257f14262ee5d999e85979aacf84635b10c49e9f1"} Feb 23 14:48:49.030781 master-0 kubenswrapper[28758]: I0223 14:48:49.030730 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 23 14:48:49.769110 master-0 kubenswrapper[28758]: I0223 14:48:49.769061 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m256t" Feb 23 14:48:49.875198 master-0 kubenswrapper[28758]: I0223 14:48:49.875114 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-combined-ca-bundle\") pod \"3f87df8a-ee8a-4398-b0ce-817da20c6349\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " Feb 23 14:48:49.875503 master-0 kubenswrapper[28758]: I0223 14:48:49.875215 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-config-data\") pod \"3f87df8a-ee8a-4398-b0ce-817da20c6349\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " Feb 23 14:48:49.875503 master-0 kubenswrapper[28758]: I0223 14:48:49.875266 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/3f87df8a-ee8a-4398-b0ce-817da20c6349-kube-api-access-x5mbl\") pod \"3f87df8a-ee8a-4398-b0ce-817da20c6349\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " Feb 23 14:48:49.875503 master-0 kubenswrapper[28758]: I0223 14:48:49.875329 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-db-sync-config-data\") pod \"3f87df8a-ee8a-4398-b0ce-817da20c6349\" (UID: \"3f87df8a-ee8a-4398-b0ce-817da20c6349\") " Feb 23 14:48:49.879250 master-0 kubenswrapper[28758]: I0223 14:48:49.879175 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f87df8a-ee8a-4398-b0ce-817da20c6349-kube-api-access-x5mbl" (OuterVolumeSpecName: "kube-api-access-x5mbl") pod "3f87df8a-ee8a-4398-b0ce-817da20c6349" (UID: "3f87df8a-ee8a-4398-b0ce-817da20c6349"). InnerVolumeSpecName "kube-api-access-x5mbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:48:49.880022 master-0 kubenswrapper[28758]: I0223 14:48:49.879965 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3f87df8a-ee8a-4398-b0ce-817da20c6349" (UID: "3f87df8a-ee8a-4398-b0ce-817da20c6349"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:49.902423 master-0 kubenswrapper[28758]: I0223 14:48:49.902310 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f87df8a-ee8a-4398-b0ce-817da20c6349" (UID: "3f87df8a-ee8a-4398-b0ce-817da20c6349"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:49.933729 master-0 kubenswrapper[28758]: I0223 14:48:49.933634 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-config-data" (OuterVolumeSpecName: "config-data") pod "3f87df8a-ee8a-4398-b0ce-817da20c6349" (UID: "3f87df8a-ee8a-4398-b0ce-817da20c6349"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:48:49.977907 master-0 kubenswrapper[28758]: I0223 14:48:49.977812 28758 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:49.977907 master-0 kubenswrapper[28758]: I0223 14:48:49.977872 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:49.977907 master-0 kubenswrapper[28758]: I0223 14:48:49.977886 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f87df8a-ee8a-4398-b0ce-817da20c6349-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:49.977907 master-0 kubenswrapper[28758]: I0223 14:48:49.977901 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mbl\" (UniqueName: \"kubernetes.io/projected/3f87df8a-ee8a-4398-b0ce-817da20c6349-kube-api-access-x5mbl\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:50.231578 master-0 kubenswrapper[28758]: I0223 14:48:50.231501 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m256t" event={"ID":"3f87df8a-ee8a-4398-b0ce-817da20c6349","Type":"ContainerDied","Data":"239108202cc5ec2de464b739bf1756d2bb8c6ef866f06fc126534a54618a82db"} Feb 23 14:48:50.231578 master-0 kubenswrapper[28758]: I0223 14:48:50.231564 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239108202cc5ec2de464b739bf1756d2bb8c6ef866f06fc126534a54618a82db" Feb 23 14:48:50.231838 master-0 kubenswrapper[28758]: I0223 14:48:50.231627 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m256t" Feb 23 14:48:50.661987 master-0 kubenswrapper[28758]: I0223 14:48:50.661921 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fcdbc89d5-z97vk"] Feb 23 14:48:50.662806 master-0 kubenswrapper[28758]: I0223 14:48:50.662773 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" podUID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerName="dnsmasq-dns" containerID="cri-o://c0046d1158b93c96a8e45b65634774215919ae07e0205924e29d7b1e77f80005" gracePeriod=10 Feb 23 14:48:50.669735 master-0 kubenswrapper[28758]: I0223 14:48:50.668980 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" Feb 23 14:48:50.725579 master-0 kubenswrapper[28758]: I0223 14:48:50.724057 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d9b67b7fc-m76r8"] Feb 23 14:48:50.725579 master-0 kubenswrapper[28758]: E0223 14:48:50.724658 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f87df8a-ee8a-4398-b0ce-817da20c6349" containerName="glance-db-sync" Feb 23 14:48:50.725579 master-0 kubenswrapper[28758]: I0223 14:48:50.724675 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f87df8a-ee8a-4398-b0ce-817da20c6349" containerName="glance-db-sync" Feb 23 14:48:50.725579 master-0 kubenswrapper[28758]: E0223 14:48:50.724704 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d387f59-bf80-4c79-ad36-cef7eff60db2" containerName="ovn-config" Feb 23 14:48:50.725579 master-0 kubenswrapper[28758]: I0223 14:48:50.724713 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d387f59-bf80-4c79-ad36-cef7eff60db2" containerName="ovn-config" Feb 23 14:48:50.725579 master-0 kubenswrapper[28758]: I0223 14:48:50.725003 28758 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3f87df8a-ee8a-4398-b0ce-817da20c6349" containerName="glance-db-sync" Feb 23 14:48:50.725579 master-0 kubenswrapper[28758]: I0223 14:48:50.725041 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d387f59-bf80-4c79-ad36-cef7eff60db2" containerName="ovn-config" Feb 23 14:48:50.726569 master-0 kubenswrapper[28758]: I0223 14:48:50.726404 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.743883 master-0 kubenswrapper[28758]: I0223 14:48:50.743815 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9b67b7fc-m76r8"] Feb 23 14:48:50.808797 master-0 kubenswrapper[28758]: I0223 14:48:50.808735 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpv6s\" (UniqueName: \"kubernetes.io/projected/67d05e7e-cde1-4bc2-93b0-62274ff2002a-kube-api-access-cpv6s\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.808797 master-0 kubenswrapper[28758]: I0223 14:48:50.808815 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-swift-storage-0\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.809419 master-0 kubenswrapper[28758]: I0223 14:48:50.808849 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-config\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.809419 master-0 kubenswrapper[28758]: I0223 
14:48:50.808903 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-svc\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.809419 master-0 kubenswrapper[28758]: I0223 14:48:50.808925 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-nb\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.809419 master-0 kubenswrapper[28758]: I0223 14:48:50.808956 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-sb\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.910676 master-0 kubenswrapper[28758]: I0223 14:48:50.910633 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpv6s\" (UniqueName: \"kubernetes.io/projected/67d05e7e-cde1-4bc2-93b0-62274ff2002a-kube-api-access-cpv6s\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.910800 master-0 kubenswrapper[28758]: I0223 14:48:50.910716 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-swift-storage-0\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " 
pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.910928 master-0 kubenswrapper[28758]: I0223 14:48:50.910850 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-config\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.910991 master-0 kubenswrapper[28758]: I0223 14:48:50.910958 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-svc\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.911032 master-0 kubenswrapper[28758]: I0223 14:48:50.910993 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-nb\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.911078 master-0 kubenswrapper[28758]: I0223 14:48:50.911030 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-sb\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.911788 master-0 kubenswrapper[28758]: I0223 14:48:50.911754 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-svc\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" 
Feb 23 14:48:50.911869 master-0 kubenswrapper[28758]: I0223 14:48:50.911821 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-swift-storage-0\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.930845 master-0 kubenswrapper[28758]: I0223 14:48:50.930733 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-sb\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.931006 master-0 kubenswrapper[28758]: I0223 14:48:50.930949 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-config\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.931322 master-0 kubenswrapper[28758]: I0223 14:48:50.931271 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-nb\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:50.935776 master-0 kubenswrapper[28758]: I0223 14:48:50.935730 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpv6s\" (UniqueName: \"kubernetes.io/projected/67d05e7e-cde1-4bc2-93b0-62274ff2002a-kube-api-access-cpv6s\") pod \"dnsmasq-dns-d9b67b7fc-m76r8\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:51.095990 
master-0 kubenswrapper[28758]: I0223 14:48:51.095857 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:48:51.285507 master-0 kubenswrapper[28758]: I0223 14:48:51.282831 28758 generic.go:334] "Generic (PLEG): container finished" podID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerID="c0046d1158b93c96a8e45b65634774215919ae07e0205924e29d7b1e77f80005" exitCode=0 Feb 23 14:48:51.285507 master-0 kubenswrapper[28758]: I0223 14:48:51.282901 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" event={"ID":"c93cc5fc-40f5-42ab-bb10-10d64fd205f8","Type":"ContainerDied","Data":"c0046d1158b93c96a8e45b65634774215919ae07e0205924e29d7b1e77f80005"} Feb 23 14:48:51.386983 master-0 kubenswrapper[28758]: I0223 14:48:51.385889 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" Feb 23 14:48:51.444509 master-0 kubenswrapper[28758]: I0223 14:48:51.427247 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-config\") pod \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " Feb 23 14:48:51.444509 master-0 kubenswrapper[28758]: I0223 14:48:51.427554 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-swift-storage-0\") pod \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " Feb 23 14:48:51.444509 master-0 kubenswrapper[28758]: I0223 14:48:51.427591 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-nb\") pod 
\"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " Feb 23 14:48:51.444509 master-0 kubenswrapper[28758]: I0223 14:48:51.427636 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-sb\") pod \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " Feb 23 14:48:51.444509 master-0 kubenswrapper[28758]: I0223 14:48:51.427727 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcc5l\" (UniqueName: \"kubernetes.io/projected/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-kube-api-access-gcc5l\") pod \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " Feb 23 14:48:51.444509 master-0 kubenswrapper[28758]: I0223 14:48:51.427845 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-svc\") pod \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\" (UID: \"c93cc5fc-40f5-42ab-bb10-10d64fd205f8\") " Feb 23 14:48:51.466147 master-0 kubenswrapper[28758]: I0223 14:48:51.459954 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-kube-api-access-gcc5l" (OuterVolumeSpecName: "kube-api-access-gcc5l") pod "c93cc5fc-40f5-42ab-bb10-10d64fd205f8" (UID: "c93cc5fc-40f5-42ab-bb10-10d64fd205f8"). InnerVolumeSpecName "kube-api-access-gcc5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:48:51.505965 master-0 kubenswrapper[28758]: I0223 14:48:51.505899 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c93cc5fc-40f5-42ab-bb10-10d64fd205f8" (UID: "c93cc5fc-40f5-42ab-bb10-10d64fd205f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:51.516283 master-0 kubenswrapper[28758]: I0223 14:48:51.516217 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c93cc5fc-40f5-42ab-bb10-10d64fd205f8" (UID: "c93cc5fc-40f5-42ab-bb10-10d64fd205f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:51.522754 master-0 kubenswrapper[28758]: I0223 14:48:51.522709 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c93cc5fc-40f5-42ab-bb10-10d64fd205f8" (UID: "c93cc5fc-40f5-42ab-bb10-10d64fd205f8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:48:51.530657 master-0 kubenswrapper[28758]: I0223 14:48:51.529813 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:51.530657 master-0 kubenswrapper[28758]: I0223 14:48:51.529849 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:51.530657 master-0 kubenswrapper[28758]: I0223 14:48:51.529860 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:51.530657 master-0 kubenswrapper[28758]: I0223 14:48:51.529871 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcc5l\" (UniqueName: \"kubernetes.io/projected/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-kube-api-access-gcc5l\") on node \"master-0\" DevicePath \"\"" Feb 23 14:48:51.532780 master-0 kubenswrapper[28758]: I0223 14:48:51.531125 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-config" (OuterVolumeSpecName: "config") pod "c93cc5fc-40f5-42ab-bb10-10d64fd205f8" (UID: "c93cc5fc-40f5-42ab-bb10-10d64fd205f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:51.532780 master-0 kubenswrapper[28758]: I0223 14:48:51.531383 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c93cc5fc-40f5-42ab-bb10-10d64fd205f8" (UID: "c93cc5fc-40f5-42ab-bb10-10d64fd205f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:48:51.632056 master-0 kubenswrapper[28758]: I0223 14:48:51.631907 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-dns-svc\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:51.632056 master-0 kubenswrapper[28758]: I0223 14:48:51.631965 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c93cc5fc-40f5-42ab-bb10-10d64fd205f8-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:48:51.714820 master-0 kubenswrapper[28758]: I0223 14:48:51.714755 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d9b67b7fc-m76r8"]
Feb 23 14:48:52.297219 master-0 kubenswrapper[28758]: I0223 14:48:52.297150 28758 generic.go:334] "Generic (PLEG): container finished" podID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerID="a2cf7ff261dd77119bc5d044f8abbdd8055a635ee7cf0e8845d31b5cff026b5d" exitCode=0
Feb 23 14:48:52.297843 master-0 kubenswrapper[28758]: I0223 14:48:52.297239 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" event={"ID":"67d05e7e-cde1-4bc2-93b0-62274ff2002a","Type":"ContainerDied","Data":"a2cf7ff261dd77119bc5d044f8abbdd8055a635ee7cf0e8845d31b5cff026b5d"}
Feb 23 14:48:52.297843 master-0 kubenswrapper[28758]: I0223 14:48:52.297273 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" event={"ID":"67d05e7e-cde1-4bc2-93b0-62274ff2002a","Type":"ContainerStarted","Data":"1446848d2286e951418f294d54a20e97e14e53decb5f633d1ea61354ac6ca13b"}
Feb 23 14:48:52.300745 master-0 kubenswrapper[28758]: I0223 14:48:52.300688 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk" event={"ID":"c93cc5fc-40f5-42ab-bb10-10d64fd205f8","Type":"ContainerDied","Data":"728ae3134d6e2469ff4e5f308301979f986db6053a9d57958b9c2e53ccfae22f"}
Feb 23 14:48:52.300745 master-0 kubenswrapper[28758]: I0223 14:48:52.300742 28758 scope.go:117] "RemoveContainer" containerID="c0046d1158b93c96a8e45b65634774215919ae07e0205924e29d7b1e77f80005"
Feb 23 14:48:52.300943 master-0 kubenswrapper[28758]: I0223 14:48:52.300796 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fcdbc89d5-z97vk"
Feb 23 14:48:52.462156 master-0 kubenswrapper[28758]: I0223 14:48:52.462098 28758 scope.go:117] "RemoveContainer" containerID="2f7b6e79b4101bf5235cfbfd048c3ff0316f64ce2d723ab0bcb8e0efdd984674"
Feb 23 14:48:52.488323 master-0 kubenswrapper[28758]: I0223 14:48:52.488256 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fcdbc89d5-z97vk"]
Feb 23 14:48:52.497583 master-0 kubenswrapper[28758]: I0223 14:48:52.497522 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fcdbc89d5-z97vk"]
Feb 23 14:48:53.320724 master-0 kubenswrapper[28758]: I0223 14:48:53.320675 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" event={"ID":"67d05e7e-cde1-4bc2-93b0-62274ff2002a","Type":"ContainerStarted","Data":"b053e4c2eb401e2d1292a5655add1836c256f85d8c7cb49382994cbed906e288"}
Feb 23 14:48:53.321439 master-0 kubenswrapper[28758]: I0223 14:48:53.321316 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8"
Feb 23 14:48:53.346880 master-0 kubenswrapper[28758]: I0223 14:48:53.346754 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" podStartSLOduration=3.3467343769999998 podStartE2EDuration="3.346734377s" podCreationTimestamp="2026-02-23 14:48:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:53.346678096 +0000 UTC m=+865.472994058" watchObservedRunningTime="2026-02-23 14:48:53.346734377 +0000 UTC m=+865.473050309"
Feb 23 14:48:54.099937 master-0 kubenswrapper[28758]: I0223 14:48:54.099859 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" path="/var/lib/kubelet/pods/c93cc5fc-40f5-42ab-bb10-10d64fd205f8/volumes"
Feb 23 14:48:57.617779 master-0 kubenswrapper[28758]: I0223 14:48:57.617695 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 23 14:48:57.939201 master-0 kubenswrapper[28758]: I0223 14:48:57.939136 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xm7bc"]
Feb 23 14:48:57.939725 master-0 kubenswrapper[28758]: E0223 14:48:57.939698 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerName="init"
Feb 23 14:48:57.939725 master-0 kubenswrapper[28758]: I0223 14:48:57.939720 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerName="init"
Feb 23 14:48:57.939827 master-0 kubenswrapper[28758]: E0223 14:48:57.939739 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerName="dnsmasq-dns"
Feb 23 14:48:57.939827 master-0 kubenswrapper[28758]: I0223 14:48:57.939746 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerName="dnsmasq-dns"
Feb 23 14:48:57.940001 master-0 kubenswrapper[28758]: I0223 14:48:57.939964 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93cc5fc-40f5-42ab-bb10-10d64fd205f8" containerName="dnsmasq-dns"
Feb 23 14:48:57.940716 master-0 kubenswrapper[28758]: I0223 14:48:57.940678 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:57.960031 master-0 kubenswrapper[28758]: I0223 14:48:57.959967 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xm7bc"]
Feb 23 14:48:58.039745 master-0 kubenswrapper[28758]: I0223 14:48:58.039659 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-03f5-account-create-update-ntvsz"]
Feb 23 14:48:58.041233 master-0 kubenswrapper[28758]: I0223 14:48:58.041199 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.043291 master-0 kubenswrapper[28758]: I0223 14:48:58.043213 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 23 14:48:58.052808 master-0 kubenswrapper[28758]: I0223 14:48:58.052731 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-03f5-account-create-update-ntvsz"]
Feb 23 14:48:58.063726 master-0 kubenswrapper[28758]: I0223 14:48:58.063669 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3df66b-646e-4494-90d6-d17492127413-operator-scripts\") pod \"cinder-db-create-xm7bc\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") " pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:58.064068 master-0 kubenswrapper[28758]: I0223 14:48:58.064022 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft4sj\" (UniqueName: \"kubernetes.io/projected/3c3df66b-646e-4494-90d6-d17492127413-kube-api-access-ft4sj\") pod \"cinder-db-create-xm7bc\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") " pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:58.166455 master-0 kubenswrapper[28758]: I0223 14:48:58.166227 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft4sj\" (UniqueName: \"kubernetes.io/projected/3c3df66b-646e-4494-90d6-d17492127413-kube-api-access-ft4sj\") pod \"cinder-db-create-xm7bc\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") " pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:58.166455 master-0 kubenswrapper[28758]: I0223 14:48:58.166355 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltng8\" (UniqueName: \"kubernetes.io/projected/eed2aa93-0ee8-4867-b988-9f8834149437-kube-api-access-ltng8\") pod \"cinder-03f5-account-create-update-ntvsz\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") " pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.166455 master-0 kubenswrapper[28758]: I0223 14:48:58.166464 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3df66b-646e-4494-90d6-d17492127413-operator-scripts\") pod \"cinder-db-create-xm7bc\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") " pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:58.166799 master-0 kubenswrapper[28758]: I0223 14:48:58.166549 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed2aa93-0ee8-4867-b988-9f8834149437-operator-scripts\") pod \"cinder-03f5-account-create-update-ntvsz\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") " pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.167674 master-0 kubenswrapper[28758]: I0223 14:48:58.167342 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3df66b-646e-4494-90d6-d17492127413-operator-scripts\") pod \"cinder-db-create-xm7bc\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") " pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:58.198273 master-0 kubenswrapper[28758]: I0223 14:48:58.198138 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft4sj\" (UniqueName: \"kubernetes.io/projected/3c3df66b-646e-4494-90d6-d17492127413-kube-api-access-ft4sj\") pod \"cinder-db-create-xm7bc\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") " pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:58.235356 master-0 kubenswrapper[28758]: I0223 14:48:58.234325 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5pnls"]
Feb 23 14:48:58.237164 master-0 kubenswrapper[28758]: I0223 14:48:58.237047 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.246603 master-0 kubenswrapper[28758]: I0223 14:48:58.246550 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5pnls"]
Feb 23 14:48:58.267952 master-0 kubenswrapper[28758]: I0223 14:48:58.267855 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed2aa93-0ee8-4867-b988-9f8834149437-operator-scripts\") pod \"cinder-03f5-account-create-update-ntvsz\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") " pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.268229 master-0 kubenswrapper[28758]: I0223 14:48:58.268111 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltng8\" (UniqueName: \"kubernetes.io/projected/eed2aa93-0ee8-4867-b988-9f8834149437-kube-api-access-ltng8\") pod \"cinder-03f5-account-create-update-ntvsz\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") " pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.269170 master-0 kubenswrapper[28758]: I0223 14:48:58.268855 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed2aa93-0ee8-4867-b988-9f8834149437-operator-scripts\") pod \"cinder-03f5-account-create-update-ntvsz\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") " pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.276693 master-0 kubenswrapper[28758]: I0223 14:48:58.276638 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:48:58.310830 master-0 kubenswrapper[28758]: I0223 14:48:58.310772 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltng8\" (UniqueName: \"kubernetes.io/projected/eed2aa93-0ee8-4867-b988-9f8834149437-kube-api-access-ltng8\") pod \"cinder-03f5-account-create-update-ntvsz\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") " pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.346389 master-0 kubenswrapper[28758]: I0223 14:48:58.346285 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-89d6-account-create-update-wb2m8"]
Feb 23 14:48:58.347991 master-0 kubenswrapper[28758]: I0223 14:48:58.347949 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.356252 master-0 kubenswrapper[28758]: I0223 14:48:58.356167 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-89d6-account-create-update-wb2m8"]
Feb 23 14:48:58.363292 master-0 kubenswrapper[28758]: I0223 14:48:58.362943 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 23 14:48:58.367588 master-0 kubenswrapper[28758]: I0223 14:48:58.366046 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:48:58.376242 master-0 kubenswrapper[28758]: I0223 14:48:58.372966 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24013480-c173-4d1f-8d00-2cffa90b04ef-operator-scripts\") pod \"neutron-db-create-5pnls\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.376242 master-0 kubenswrapper[28758]: I0223 14:48:58.373155 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pdcc\" (UniqueName: \"kubernetes.io/projected/24013480-c173-4d1f-8d00-2cffa90b04ef-kube-api-access-5pdcc\") pod \"neutron-db-create-5pnls\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.412069 master-0 kubenswrapper[28758]: I0223 14:48:58.411995 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5k6m4"]
Feb 23 14:48:58.414647 master-0 kubenswrapper[28758]: I0223 14:48:58.414530 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.423548 master-0 kubenswrapper[28758]: I0223 14:48:58.422413 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 23 14:48:58.423548 master-0 kubenswrapper[28758]: I0223 14:48:58.422490 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 23 14:48:58.437973 master-0 kubenswrapper[28758]: I0223 14:48:58.437902 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 23 14:48:58.468953 master-0 kubenswrapper[28758]: I0223 14:48:58.468815 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5k6m4"]
Feb 23 14:48:58.483983 master-0 kubenswrapper[28758]: I0223 14:48:58.478112 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pdcc\" (UniqueName: \"kubernetes.io/projected/24013480-c173-4d1f-8d00-2cffa90b04ef-kube-api-access-5pdcc\") pod \"neutron-db-create-5pnls\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.483983 master-0 kubenswrapper[28758]: I0223 14:48:58.478195 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac22f15-2beb-4f7e-b34c-842e8c0fa082-operator-scripts\") pod \"neutron-89d6-account-create-update-wb2m8\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.483983 master-0 kubenswrapper[28758]: I0223 14:48:58.478264 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvwzj\" (UniqueName: \"kubernetes.io/projected/cac22f15-2beb-4f7e-b34c-842e8c0fa082-kube-api-access-kvwzj\") pod \"neutron-89d6-account-create-update-wb2m8\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.483983 master-0 kubenswrapper[28758]: I0223 14:48:58.478352 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24013480-c173-4d1f-8d00-2cffa90b04ef-operator-scripts\") pod \"neutron-db-create-5pnls\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.483983 master-0 kubenswrapper[28758]: I0223 14:48:58.479242 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24013480-c173-4d1f-8d00-2cffa90b04ef-operator-scripts\") pod \"neutron-db-create-5pnls\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.511730 master-0 kubenswrapper[28758]: I0223 14:48:58.499118 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pdcc\" (UniqueName: \"kubernetes.io/projected/24013480-c173-4d1f-8d00-2cffa90b04ef-kube-api-access-5pdcc\") pod \"neutron-db-create-5pnls\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.581832 master-0 kubenswrapper[28758]: I0223 14:48:58.581758 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-combined-ca-bundle\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.582057 master-0 kubenswrapper[28758]: I0223 14:48:58.581991 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-config-data\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.582313 master-0 kubenswrapper[28758]: I0223 14:48:58.582248 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62s8\" (UniqueName: \"kubernetes.io/projected/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-kube-api-access-q62s8\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.582456 master-0 kubenswrapper[28758]: I0223 14:48:58.582430 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac22f15-2beb-4f7e-b34c-842e8c0fa082-operator-scripts\") pod \"neutron-89d6-account-create-update-wb2m8\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.582593 master-0 kubenswrapper[28758]: I0223 14:48:58.582573 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvwzj\" (UniqueName: \"kubernetes.io/projected/cac22f15-2beb-4f7e-b34c-842e8c0fa082-kube-api-access-kvwzj\") pod \"neutron-89d6-account-create-update-wb2m8\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.583441 master-0 kubenswrapper[28758]: I0223 14:48:58.583302 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac22f15-2beb-4f7e-b34c-842e8c0fa082-operator-scripts\") pod \"neutron-89d6-account-create-update-wb2m8\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.599369 master-0 kubenswrapper[28758]: I0223 14:48:58.599297 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5pnls"
Feb 23 14:48:58.602464 master-0 kubenswrapper[28758]: I0223 14:48:58.602414 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvwzj\" (UniqueName: \"kubernetes.io/projected/cac22f15-2beb-4f7e-b34c-842e8c0fa082-kube-api-access-kvwzj\") pod \"neutron-89d6-account-create-update-wb2m8\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.688314 master-0 kubenswrapper[28758]: I0223 14:48:58.687875 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q62s8\" (UniqueName: \"kubernetes.io/projected/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-kube-api-access-q62s8\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.688314 master-0 kubenswrapper[28758]: I0223 14:48:58.688086 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-combined-ca-bundle\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.688314 master-0 kubenswrapper[28758]: I0223 14:48:58.688185 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-config-data\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.691967 master-0 kubenswrapper[28758]: I0223 14:48:58.691906 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-combined-ca-bundle\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.692989 master-0 kubenswrapper[28758]: I0223 14:48:58.692815 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-config-data\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.706539 master-0 kubenswrapper[28758]: I0223 14:48:58.705834 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-89d6-account-create-update-wb2m8"
Feb 23 14:48:58.707798 master-0 kubenswrapper[28758]: I0223 14:48:58.707748 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62s8\" (UniqueName: \"kubernetes.io/projected/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-kube-api-access-q62s8\") pod \"keystone-db-sync-5k6m4\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.825960 master-0 kubenswrapper[28758]: I0223 14:48:58.824056 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5k6m4"
Feb 23 14:48:58.835663 master-0 kubenswrapper[28758]: W0223 14:48:58.830280 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c3df66b_646e_4494_90d6_d17492127413.slice/crio-f86ecea0baa0b98a8a2c40bb3ed1131677728eecdf7a2b5bdedc21165d51d98b WatchSource:0}: Error finding container f86ecea0baa0b98a8a2c40bb3ed1131677728eecdf7a2b5bdedc21165d51d98b: Status 404 returned error can't find the container with id f86ecea0baa0b98a8a2c40bb3ed1131677728eecdf7a2b5bdedc21165d51d98b
Feb 23 14:48:58.835663 master-0 kubenswrapper[28758]: I0223 14:48:58.833365 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xm7bc"]
Feb 23 14:48:58.969227 master-0 kubenswrapper[28758]: I0223 14:48:58.964602 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-03f5-account-create-update-ntvsz"]
Feb 23 14:48:58.976507 master-0 kubenswrapper[28758]: W0223 14:48:58.976433 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed2aa93_0ee8_4867_b988_9f8834149437.slice/crio-087485af20c55565fcb64c916deaacb09384e309bb50b8a5ddd385b04d55501e WatchSource:0}: Error finding container 087485af20c55565fcb64c916deaacb09384e309bb50b8a5ddd385b04d55501e: Status 404 returned error can't find the container with id 087485af20c55565fcb64c916deaacb09384e309bb50b8a5ddd385b04d55501e
Feb 23 14:48:59.123141 master-0 kubenswrapper[28758]: I0223 14:48:59.114666 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5pnls"]
Feb 23 14:48:59.265608 master-0 kubenswrapper[28758]: I0223 14:48:59.263754 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-89d6-account-create-update-wb2m8"]
Feb 23 14:48:59.418152 master-0 kubenswrapper[28758]: I0223 14:48:59.418064 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-89d6-account-create-update-wb2m8" event={"ID":"cac22f15-2beb-4f7e-b34c-842e8c0fa082","Type":"ContainerStarted","Data":"368081d1bf353e2148978c3d61eeac9fd6847bfe6da2010faed16f5dc75eb99e"}
Feb 23 14:48:59.420818 master-0 kubenswrapper[28758]: I0223 14:48:59.420786 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-03f5-account-create-update-ntvsz" event={"ID":"eed2aa93-0ee8-4867-b988-9f8834149437","Type":"ContainerStarted","Data":"8862597eeca17852c4a830ebe434f16d7190bf543a8ffc6876d53ff2bcc8e132"}
Feb 23 14:48:59.420913 master-0 kubenswrapper[28758]: I0223 14:48:59.420818 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-03f5-account-create-update-ntvsz" event={"ID":"eed2aa93-0ee8-4867-b988-9f8834149437","Type":"ContainerStarted","Data":"087485af20c55565fcb64c916deaacb09384e309bb50b8a5ddd385b04d55501e"}
Feb 23 14:48:59.422324 master-0 kubenswrapper[28758]: I0223 14:48:59.422233 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5pnls" event={"ID":"24013480-c173-4d1f-8d00-2cffa90b04ef","Type":"ContainerStarted","Data":"e97683a069e1e38831efdcdb7dd7a3a03d388ecc9fcd81a386588d311625e07e"}
Feb 23 14:48:59.425322 master-0 kubenswrapper[28758]: I0223 14:48:59.425253 28758 generic.go:334] "Generic (PLEG): container finished" podID="3c3df66b-646e-4494-90d6-d17492127413" containerID="8c40124fd4dfb812e34b683b70a2dda7334a023d4c9604b37125809b7545bb4b" exitCode=0
Feb 23 14:48:59.425432 master-0 kubenswrapper[28758]: I0223 14:48:59.425328 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm7bc" event={"ID":"3c3df66b-646e-4494-90d6-d17492127413","Type":"ContainerDied","Data":"8c40124fd4dfb812e34b683b70a2dda7334a023d4c9604b37125809b7545bb4b"}
Feb 23 14:48:59.425432 master-0 kubenswrapper[28758]: I0223 14:48:59.425367 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm7bc" event={"ID":"3c3df66b-646e-4494-90d6-d17492127413","Type":"ContainerStarted","Data":"f86ecea0baa0b98a8a2c40bb3ed1131677728eecdf7a2b5bdedc21165d51d98b"}
Feb 23 14:48:59.522577 master-0 kubenswrapper[28758]: I0223 14:48:59.520766 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5k6m4"]
Feb 23 14:48:59.534019 master-0 kubenswrapper[28758]: I0223 14:48:59.532961 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-03f5-account-create-update-ntvsz" podStartSLOduration=2.532923869 podStartE2EDuration="2.532923869s" podCreationTimestamp="2026-02-23 14:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:48:59.531488831 +0000 UTC m=+871.657804763" watchObservedRunningTime="2026-02-23 14:48:59.532923869 +0000 UTC m=+871.659239801"
Feb 23 14:48:59.601282 master-0 kubenswrapper[28758]: W0223 14:48:59.601202 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4d87f9_8a0a_4e73_bb46_d5b42a03a848.slice/crio-a2791c8699bcfb22402ed0472afc034f4c8346c29d13a2e87e90b8b912733ac8 WatchSource:0}: Error finding container a2791c8699bcfb22402ed0472afc034f4c8346c29d13a2e87e90b8b912733ac8: Status 404 returned error can't find the container with id a2791c8699bcfb22402ed0472afc034f4c8346c29d13a2e87e90b8b912733ac8
Feb 23 14:49:00.435590 master-0 kubenswrapper[28758]: I0223 14:49:00.435519 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5k6m4" event={"ID":"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848","Type":"ContainerStarted","Data":"a2791c8699bcfb22402ed0472afc034f4c8346c29d13a2e87e90b8b912733ac8"}
Feb 23 14:49:00.437467 master-0 kubenswrapper[28758]: I0223 14:49:00.437432 28758 generic.go:334] "Generic (PLEG): container finished" podID="cac22f15-2beb-4f7e-b34c-842e8c0fa082" containerID="b1a5bfd0e6cd5857544f9bcef717e675bca6f33a5e3a683f45c863248a2a0d06" exitCode=0
Feb 23 14:49:00.437648 master-0 kubenswrapper[28758]: I0223 14:49:00.437597 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-89d6-account-create-update-wb2m8" event={"ID":"cac22f15-2beb-4f7e-b34c-842e8c0fa082","Type":"ContainerDied","Data":"b1a5bfd0e6cd5857544f9bcef717e675bca6f33a5e3a683f45c863248a2a0d06"}
Feb 23 14:49:00.440544 master-0 kubenswrapper[28758]: I0223 14:49:00.440516 28758 generic.go:334] "Generic (PLEG): container finished" podID="24013480-c173-4d1f-8d00-2cffa90b04ef" containerID="3cfcfafb4bed9891235ccb19553f2a35a50b794ce3fc4665a9185ee65ea8d633" exitCode=0
Feb 23 14:49:00.440615 master-0 kubenswrapper[28758]: I0223 14:49:00.440545 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5pnls" event={"ID":"24013480-c173-4d1f-8d00-2cffa90b04ef","Type":"ContainerDied","Data":"3cfcfafb4bed9891235ccb19553f2a35a50b794ce3fc4665a9185ee65ea8d633"}
Feb 23 14:49:00.442873 master-0 kubenswrapper[28758]: I0223 14:49:00.442844 28758 generic.go:334] "Generic (PLEG): container finished" podID="eed2aa93-0ee8-4867-b988-9f8834149437" containerID="8862597eeca17852c4a830ebe434f16d7190bf543a8ffc6876d53ff2bcc8e132" exitCode=0
Feb 23 14:49:00.442990 master-0 kubenswrapper[28758]: I0223 14:49:00.442912 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-03f5-account-create-update-ntvsz" event={"ID":"eed2aa93-0ee8-4867-b988-9f8834149437","Type":"ContainerDied","Data":"8862597eeca17852c4a830ebe434f16d7190bf543a8ffc6876d53ff2bcc8e132"}
Feb 23 14:49:01.098702 master-0 kubenswrapper[28758]: I0223 14:49:01.098644 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8"
Feb 23 14:49:01.551658 master-0 kubenswrapper[28758]: I0223 14:49:01.550763 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b55dc5f67-8cr6h"]
Feb 23 14:49:01.551658 master-0 kubenswrapper[28758]: I0223 14:49:01.551020 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" podUID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerName="dnsmasq-dns" containerID="cri-o://a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b" gracePeriod=10
Feb 23 14:49:01.945188 master-0 kubenswrapper[28758]: I0223 14:49:01.944532 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:49:02.115761 master-0 kubenswrapper[28758]: I0223 14:49:02.115549 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3df66b-646e-4494-90d6-d17492127413-operator-scripts\") pod \"3c3df66b-646e-4494-90d6-d17492127413\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") "
Feb 23 14:49:02.115886 master-0 kubenswrapper[28758]: I0223 14:49:02.115854 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft4sj\" (UniqueName: \"kubernetes.io/projected/3c3df66b-646e-4494-90d6-d17492127413-kube-api-access-ft4sj\") pod \"3c3df66b-646e-4494-90d6-d17492127413\" (UID: \"3c3df66b-646e-4494-90d6-d17492127413\") "
Feb 23 14:49:02.116370 master-0 kubenswrapper[28758]: I0223 14:49:02.116323 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c3df66b-646e-4494-90d6-d17492127413-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c3df66b-646e-4494-90d6-d17492127413" (UID: "3c3df66b-646e-4494-90d6-d17492127413"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:49:02.116830 master-0 kubenswrapper[28758]: I0223 14:49:02.116774 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c3df66b-646e-4494-90d6-d17492127413-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:02.120050 master-0 kubenswrapper[28758]: I0223 14:49:02.120004 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c3df66b-646e-4494-90d6-d17492127413-kube-api-access-ft4sj" (OuterVolumeSpecName: "kube-api-access-ft4sj") pod "3c3df66b-646e-4494-90d6-d17492127413" (UID: "3c3df66b-646e-4494-90d6-d17492127413"). InnerVolumeSpecName "kube-api-access-ft4sj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:49:02.182784 master-0 kubenswrapper[28758]: I0223 14:49:02.182688 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-03f5-account-create-update-ntvsz"
Feb 23 14:49:02.219711 master-0 kubenswrapper[28758]: I0223 14:49:02.219513 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft4sj\" (UniqueName: \"kubernetes.io/projected/3c3df66b-646e-4494-90d6-d17492127413-kube-api-access-ft4sj\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:02.327633 master-0 kubenswrapper[28758]: I0223 14:49:02.327521 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed2aa93-0ee8-4867-b988-9f8834149437-operator-scripts\") pod \"eed2aa93-0ee8-4867-b988-9f8834149437\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") "
Feb 23 14:49:02.327633 master-0 kubenswrapper[28758]: I0223 14:49:02.327582 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltng8\" (UniqueName: \"kubernetes.io/projected/eed2aa93-0ee8-4867-b988-9f8834149437-kube-api-access-ltng8\") pod \"eed2aa93-0ee8-4867-b988-9f8834149437\" (UID: \"eed2aa93-0ee8-4867-b988-9f8834149437\") "
Feb 23 14:49:02.328934 master-0 kubenswrapper[28758]: I0223 14:49:02.328882 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eed2aa93-0ee8-4867-b988-9f8834149437-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eed2aa93-0ee8-4867-b988-9f8834149437" (UID: "eed2aa93-0ee8-4867-b988-9f8834149437"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:49:02.341460 master-0 kubenswrapper[28758]: I0223 14:49:02.341414 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed2aa93-0ee8-4867-b988-9f8834149437-kube-api-access-ltng8" (OuterVolumeSpecName: "kube-api-access-ltng8") pod "eed2aa93-0ee8-4867-b988-9f8834149437" (UID: "eed2aa93-0ee8-4867-b988-9f8834149437"). InnerVolumeSpecName "kube-api-access-ltng8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:49:02.462016 master-0 kubenswrapper[28758]: I0223 14:49:02.460473 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eed2aa93-0ee8-4867-b988-9f8834149437-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:02.462016 master-0 kubenswrapper[28758]: I0223 14:49:02.460540 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltng8\" (UniqueName: \"kubernetes.io/projected/eed2aa93-0ee8-4867-b988-9f8834149437-kube-api-access-ltng8\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:02.500370 master-0 kubenswrapper[28758]: I0223 14:49:02.500344 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5pnls"
Feb 23 14:49:02.525388 master-0 kubenswrapper[28758]: I0223 14:49:02.522897 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xm7bc" event={"ID":"3c3df66b-646e-4494-90d6-d17492127413","Type":"ContainerDied","Data":"f86ecea0baa0b98a8a2c40bb3ed1131677728eecdf7a2b5bdedc21165d51d98b"}
Feb 23 14:49:02.525388 master-0 kubenswrapper[28758]: I0223 14:49:02.522966 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86ecea0baa0b98a8a2c40bb3ed1131677728eecdf7a2b5bdedc21165d51d98b"
Feb 23 14:49:02.525388 master-0 kubenswrapper[28758]: I0223 14:49:02.523072 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xm7bc"
Feb 23 14:49:02.525388 master-0 kubenswrapper[28758]: I0223 14:49:02.523874 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h"
Feb 23 14:49:02.556499 master-0 kubenswrapper[28758]: I0223 14:49:02.556301 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5pnls" event={"ID":"24013480-c173-4d1f-8d00-2cffa90b04ef","Type":"ContainerDied","Data":"e97683a069e1e38831efdcdb7dd7a3a03d388ecc9fcd81a386588d311625e07e"}
Feb 23 14:49:02.556499 master-0 kubenswrapper[28758]: I0223 14:49:02.556366 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e97683a069e1e38831efdcdb7dd7a3a03d388ecc9fcd81a386588d311625e07e"
Feb 23 14:49:02.556499 master-0 kubenswrapper[28758]: I0223 14:49:02.556435 28758 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-db-create-5pnls" Feb 23 14:49:02.572541 master-0 kubenswrapper[28758]: I0223 14:49:02.559188 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-03f5-account-create-update-ntvsz" event={"ID":"eed2aa93-0ee8-4867-b988-9f8834149437","Type":"ContainerDied","Data":"087485af20c55565fcb64c916deaacb09384e309bb50b8a5ddd385b04d55501e"} Feb 23 14:49:02.572541 master-0 kubenswrapper[28758]: I0223 14:49:02.559217 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="087485af20c55565fcb64c916deaacb09384e309bb50b8a5ddd385b04d55501e" Feb 23 14:49:02.572541 master-0 kubenswrapper[28758]: I0223 14:49:02.559268 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-03f5-account-create-update-ntvsz" Feb 23 14:49:02.609310 master-0 kubenswrapper[28758]: I0223 14:49:02.609259 28758 generic.go:334] "Generic (PLEG): container finished" podID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerID="a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b" exitCode=0 Feb 23 14:49:02.609310 master-0 kubenswrapper[28758]: I0223 14:49:02.609298 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" Feb 23 14:49:02.609467 master-0 kubenswrapper[28758]: I0223 14:49:02.609329 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" event={"ID":"162300cf-7eec-4b50-8667-5b6abea5e4d1","Type":"ContainerDied","Data":"a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b"} Feb 23 14:49:02.609467 master-0 kubenswrapper[28758]: I0223 14:49:02.609424 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b55dc5f67-8cr6h" event={"ID":"162300cf-7eec-4b50-8667-5b6abea5e4d1","Type":"ContainerDied","Data":"2757e8f14dd508d75782381be5749d4f93480f9312cb88d5b478319ab1997851"} Feb 23 14:49:02.609467 master-0 kubenswrapper[28758]: I0223 14:49:02.609449 28758 scope.go:117] "RemoveContainer" containerID="a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b" Feb 23 14:49:02.625553 master-0 kubenswrapper[28758]: I0223 14:49:02.625500 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-89d6-account-create-update-wb2m8" Feb 23 14:49:02.648543 master-0 kubenswrapper[28758]: I0223 14:49:02.648511 28758 scope.go:117] "RemoveContainer" containerID="e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25" Feb 23 14:49:02.671300 master-0 kubenswrapper[28758]: I0223 14:49:02.671234 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scmqn\" (UniqueName: \"kubernetes.io/projected/162300cf-7eec-4b50-8667-5b6abea5e4d1-kube-api-access-scmqn\") pod \"162300cf-7eec-4b50-8667-5b6abea5e4d1\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " Feb 23 14:49:02.671549 master-0 kubenswrapper[28758]: I0223 14:49:02.671359 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24013480-c173-4d1f-8d00-2cffa90b04ef-operator-scripts\") pod \"24013480-c173-4d1f-8d00-2cffa90b04ef\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " Feb 23 14:49:02.671549 master-0 kubenswrapper[28758]: I0223 14:49:02.671391 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-sb\") pod \"162300cf-7eec-4b50-8667-5b6abea5e4d1\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " Feb 23 14:49:02.671549 master-0 kubenswrapper[28758]: I0223 14:49:02.671424 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-config\") pod \"162300cf-7eec-4b50-8667-5b6abea5e4d1\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " Feb 23 14:49:02.671549 master-0 kubenswrapper[28758]: I0223 14:49:02.671511 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pdcc\" (UniqueName: 
\"kubernetes.io/projected/24013480-c173-4d1f-8d00-2cffa90b04ef-kube-api-access-5pdcc\") pod \"24013480-c173-4d1f-8d00-2cffa90b04ef\" (UID: \"24013480-c173-4d1f-8d00-2cffa90b04ef\") " Feb 23 14:49:02.671712 master-0 kubenswrapper[28758]: I0223 14:49:02.671612 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-dns-svc\") pod \"162300cf-7eec-4b50-8667-5b6abea5e4d1\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " Feb 23 14:49:02.671712 master-0 kubenswrapper[28758]: I0223 14:49:02.671697 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-nb\") pod \"162300cf-7eec-4b50-8667-5b6abea5e4d1\" (UID: \"162300cf-7eec-4b50-8667-5b6abea5e4d1\") " Feb 23 14:49:02.674874 master-0 kubenswrapper[28758]: I0223 14:49:02.674809 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162300cf-7eec-4b50-8667-5b6abea5e4d1-kube-api-access-scmqn" (OuterVolumeSpecName: "kube-api-access-scmqn") pod "162300cf-7eec-4b50-8667-5b6abea5e4d1" (UID: "162300cf-7eec-4b50-8667-5b6abea5e4d1"). InnerVolumeSpecName "kube-api-access-scmqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:02.679354 master-0 kubenswrapper[28758]: I0223 14:49:02.678169 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24013480-c173-4d1f-8d00-2cffa90b04ef-kube-api-access-5pdcc" (OuterVolumeSpecName: "kube-api-access-5pdcc") pod "24013480-c173-4d1f-8d00-2cffa90b04ef" (UID: "24013480-c173-4d1f-8d00-2cffa90b04ef"). InnerVolumeSpecName "kube-api-access-5pdcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:02.679354 master-0 kubenswrapper[28758]: I0223 14:49:02.678617 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24013480-c173-4d1f-8d00-2cffa90b04ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24013480-c173-4d1f-8d00-2cffa90b04ef" (UID: "24013480-c173-4d1f-8d00-2cffa90b04ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:02.725340 master-0 kubenswrapper[28758]: I0223 14:49:02.725154 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "162300cf-7eec-4b50-8667-5b6abea5e4d1" (UID: "162300cf-7eec-4b50-8667-5b6abea5e4d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:02.734060 master-0 kubenswrapper[28758]: I0223 14:49:02.734009 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-config" (OuterVolumeSpecName: "config") pod "162300cf-7eec-4b50-8667-5b6abea5e4d1" (UID: "162300cf-7eec-4b50-8667-5b6abea5e4d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:02.735544 master-0 kubenswrapper[28758]: I0223 14:49:02.735444 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "162300cf-7eec-4b50-8667-5b6abea5e4d1" (UID: "162300cf-7eec-4b50-8667-5b6abea5e4d1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:02.764663 master-0 kubenswrapper[28758]: I0223 14:49:02.764599 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "162300cf-7eec-4b50-8667-5b6abea5e4d1" (UID: "162300cf-7eec-4b50-8667-5b6abea5e4d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:02.773985 master-0 kubenswrapper[28758]: I0223 14:49:02.773933 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvwzj\" (UniqueName: \"kubernetes.io/projected/cac22f15-2beb-4f7e-b34c-842e8c0fa082-kube-api-access-kvwzj\") pod \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " Feb 23 14:49:02.774150 master-0 kubenswrapper[28758]: I0223 14:49:02.773997 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac22f15-2beb-4f7e-b34c-842e8c0fa082-operator-scripts\") pod \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\" (UID: \"cac22f15-2beb-4f7e-b34c-842e8c0fa082\") " Feb 23 14:49:02.774534 master-0 kubenswrapper[28758]: I0223 14:49:02.774504 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.774534 master-0 kubenswrapper[28758]: I0223 14:49:02.774525 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scmqn\" (UniqueName: \"kubernetes.io/projected/162300cf-7eec-4b50-8667-5b6abea5e4d1-kube-api-access-scmqn\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.774534 master-0 kubenswrapper[28758]: I0223 14:49:02.774536 28758 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24013480-c173-4d1f-8d00-2cffa90b04ef-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.774710 master-0 kubenswrapper[28758]: I0223 14:49:02.774545 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.774710 master-0 kubenswrapper[28758]: I0223 14:49:02.774555 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.774710 master-0 kubenswrapper[28758]: I0223 14:49:02.774564 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pdcc\" (UniqueName: \"kubernetes.io/projected/24013480-c173-4d1f-8d00-2cffa90b04ef-kube-api-access-5pdcc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.774710 master-0 kubenswrapper[28758]: I0223 14:49:02.774573 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/162300cf-7eec-4b50-8667-5b6abea5e4d1-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.774936 master-0 kubenswrapper[28758]: I0223 14:49:02.774914 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac22f15-2beb-4f7e-b34c-842e8c0fa082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cac22f15-2beb-4f7e-b34c-842e8c0fa082" (UID: "cac22f15-2beb-4f7e-b34c-842e8c0fa082"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:02.778086 master-0 kubenswrapper[28758]: I0223 14:49:02.778002 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac22f15-2beb-4f7e-b34c-842e8c0fa082-kube-api-access-kvwzj" (OuterVolumeSpecName: "kube-api-access-kvwzj") pod "cac22f15-2beb-4f7e-b34c-842e8c0fa082" (UID: "cac22f15-2beb-4f7e-b34c-842e8c0fa082"). InnerVolumeSpecName "kube-api-access-kvwzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:02.848163 master-0 kubenswrapper[28758]: I0223 14:49:02.848113 28758 scope.go:117] "RemoveContainer" containerID="a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b" Feb 23 14:49:02.849768 master-0 kubenswrapper[28758]: E0223 14:49:02.849714 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b\": container with ID starting with a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b not found: ID does not exist" containerID="a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b" Feb 23 14:49:02.849870 master-0 kubenswrapper[28758]: I0223 14:49:02.849780 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b"} err="failed to get container status \"a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b\": rpc error: code = NotFound desc = could not find container \"a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b\": container with ID starting with a1fae130cb2207727b5ef0cadf257409cc274b961ec4da49515e1ca9f908680b not found: ID does not exist" Feb 23 14:49:02.849870 master-0 kubenswrapper[28758]: I0223 14:49:02.849815 28758 scope.go:117] "RemoveContainer" 
containerID="e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25" Feb 23 14:49:02.850417 master-0 kubenswrapper[28758]: E0223 14:49:02.850383 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25\": container with ID starting with e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25 not found: ID does not exist" containerID="e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25" Feb 23 14:49:02.850568 master-0 kubenswrapper[28758]: I0223 14:49:02.850417 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25"} err="failed to get container status \"e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25\": rpc error: code = NotFound desc = could not find container \"e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25\": container with ID starting with e1aed1dc5b64635fff97630e05593e15292c2d09f5c97e5ef47dd7f57144fc25 not found: ID does not exist" Feb 23 14:49:02.876699 master-0 kubenswrapper[28758]: I0223 14:49:02.876650 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvwzj\" (UniqueName: \"kubernetes.io/projected/cac22f15-2beb-4f7e-b34c-842e8c0fa082-kube-api-access-kvwzj\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.876699 master-0 kubenswrapper[28758]: I0223 14:49:02.876699 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac22f15-2beb-4f7e-b34c-842e8c0fa082-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:02.999269 master-0 kubenswrapper[28758]: I0223 14:49:02.995666 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b55dc5f67-8cr6h"] Feb 23 14:49:03.037288 master-0 kubenswrapper[28758]: I0223 
14:49:03.037208 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b55dc5f67-8cr6h"] Feb 23 14:49:03.621007 master-0 kubenswrapper[28758]: I0223 14:49:03.620905 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-89d6-account-create-update-wb2m8" event={"ID":"cac22f15-2beb-4f7e-b34c-842e8c0fa082","Type":"ContainerDied","Data":"368081d1bf353e2148978c3d61eeac9fd6847bfe6da2010faed16f5dc75eb99e"} Feb 23 14:49:03.621007 master-0 kubenswrapper[28758]: I0223 14:49:03.620992 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="368081d1bf353e2148978c3d61eeac9fd6847bfe6da2010faed16f5dc75eb99e" Feb 23 14:49:03.621865 master-0 kubenswrapper[28758]: I0223 14:49:03.621063 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-89d6-account-create-update-wb2m8" Feb 23 14:49:04.101020 master-0 kubenswrapper[28758]: I0223 14:49:04.100947 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162300cf-7eec-4b50-8667-5b6abea5e4d1" path="/var/lib/kubelet/pods/162300cf-7eec-4b50-8667-5b6abea5e4d1/volumes" Feb 23 14:49:07.668185 master-0 kubenswrapper[28758]: I0223 14:49:07.667892 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5k6m4" event={"ID":"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848","Type":"ContainerStarted","Data":"b5cfdf4d7d4d0b962b36c937abac8c0ea91f4fe8099a02445b571528e1284019"} Feb 23 14:49:07.695455 master-0 kubenswrapper[28758]: I0223 14:49:07.695315 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5k6m4" podStartSLOduration=2.817330545 podStartE2EDuration="9.695290747s" podCreationTimestamp="2026-02-23 14:48:58 +0000 UTC" firstStartedPulling="2026-02-23 14:48:59.603958066 +0000 UTC m=+871.730273998" lastFinishedPulling="2026-02-23 14:49:06.481918258 +0000 UTC m=+878.608234200" observedRunningTime="2026-02-23 
14:49:07.686813542 +0000 UTC m=+879.813129474" watchObservedRunningTime="2026-02-23 14:49:07.695290747 +0000 UTC m=+879.821606679" Feb 23 14:49:11.712992 master-0 kubenswrapper[28758]: I0223 14:49:11.712915 28758 generic.go:334] "Generic (PLEG): container finished" podID="9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" containerID="b5cfdf4d7d4d0b962b36c937abac8c0ea91f4fe8099a02445b571528e1284019" exitCode=0 Feb 23 14:49:11.712992 master-0 kubenswrapper[28758]: I0223 14:49:11.712975 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5k6m4" event={"ID":"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848","Type":"ContainerDied","Data":"b5cfdf4d7d4d0b962b36c937abac8c0ea91f4fe8099a02445b571528e1284019"} Feb 23 14:49:13.144059 master-0 kubenswrapper[28758]: I0223 14:49:13.143987 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5k6m4" Feb 23 14:49:13.231581 master-0 kubenswrapper[28758]: I0223 14:49:13.231486 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q62s8\" (UniqueName: \"kubernetes.io/projected/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-kube-api-access-q62s8\") pod \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " Feb 23 14:49:13.231941 master-0 kubenswrapper[28758]: I0223 14:49:13.231709 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-config-data\") pod \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\" (UID: \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " Feb 23 14:49:13.231941 master-0 kubenswrapper[28758]: I0223 14:49:13.231753 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-combined-ca-bundle\") pod \"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\" (UID: 
\"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848\") " Feb 23 14:49:13.235054 master-0 kubenswrapper[28758]: I0223 14:49:13.235008 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-kube-api-access-q62s8" (OuterVolumeSpecName: "kube-api-access-q62s8") pod "9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" (UID: "9b4d87f9-8a0a-4e73-bb46-d5b42a03a848"). InnerVolumeSpecName "kube-api-access-q62s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:13.257188 master-0 kubenswrapper[28758]: I0223 14:49:13.257136 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" (UID: "9b4d87f9-8a0a-4e73-bb46-d5b42a03a848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:13.280923 master-0 kubenswrapper[28758]: I0223 14:49:13.280864 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-config-data" (OuterVolumeSpecName: "config-data") pod "9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" (UID: "9b4d87f9-8a0a-4e73-bb46-d5b42a03a848"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:13.334890 master-0 kubenswrapper[28758]: I0223 14:49:13.334838 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q62s8\" (UniqueName: \"kubernetes.io/projected/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-kube-api-access-q62s8\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:13.334890 master-0 kubenswrapper[28758]: I0223 14:49:13.334885 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:13.335203 master-0 kubenswrapper[28758]: I0223 14:49:13.334901 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:13.733577 master-0 kubenswrapper[28758]: I0223 14:49:13.733471 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5k6m4" event={"ID":"9b4d87f9-8a0a-4e73-bb46-d5b42a03a848","Type":"ContainerDied","Data":"a2791c8699bcfb22402ed0472afc034f4c8346c29d13a2e87e90b8b912733ac8"} Feb 23 14:49:13.733577 master-0 kubenswrapper[28758]: I0223 14:49:13.733553 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2791c8699bcfb22402ed0472afc034f4c8346c29d13a2e87e90b8b912733ac8" Feb 23 14:49:13.733577 master-0 kubenswrapper[28758]: I0223 14:49:13.733527 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5k6m4" Feb 23 14:49:14.052673 master-0 kubenswrapper[28758]: I0223 14:49:14.052613 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bb5cf47f-hcx8l"] Feb 23 14:49:14.053108 master-0 kubenswrapper[28758]: E0223 14:49:14.053092 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c3df66b-646e-4494-90d6-d17492127413" containerName="mariadb-database-create" Feb 23 14:49:14.053161 master-0 kubenswrapper[28758]: I0223 14:49:14.053113 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c3df66b-646e-4494-90d6-d17492127413" containerName="mariadb-database-create" Feb 23 14:49:14.053161 master-0 kubenswrapper[28758]: E0223 14:49:14.053130 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerName="dnsmasq-dns" Feb 23 14:49:14.053161 master-0 kubenswrapper[28758]: I0223 14:49:14.053136 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerName="dnsmasq-dns" Feb 23 14:49:14.053161 master-0 kubenswrapper[28758]: E0223 14:49:14.053149 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerName="init" Feb 23 14:49:14.053161 master-0 kubenswrapper[28758]: I0223 14:49:14.053155 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerName="init" Feb 23 14:49:14.053335 master-0 kubenswrapper[28758]: E0223 14:49:14.053167 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac22f15-2beb-4f7e-b34c-842e8c0fa082" containerName="mariadb-account-create-update" Feb 23 14:49:14.053335 master-0 kubenswrapper[28758]: I0223 14:49:14.053173 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac22f15-2beb-4f7e-b34c-842e8c0fa082" containerName="mariadb-account-create-update" Feb 23 14:49:14.053335 master-0 
kubenswrapper[28758]: E0223 14:49:14.053217 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24013480-c173-4d1f-8d00-2cffa90b04ef" containerName="mariadb-database-create" Feb 23 14:49:14.053335 master-0 kubenswrapper[28758]: I0223 14:49:14.053226 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="24013480-c173-4d1f-8d00-2cffa90b04ef" containerName="mariadb-database-create" Feb 23 14:49:14.053335 master-0 kubenswrapper[28758]: E0223 14:49:14.053248 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" containerName="keystone-db-sync" Feb 23 14:49:14.053335 master-0 kubenswrapper[28758]: I0223 14:49:14.053254 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" containerName="keystone-db-sync" Feb 23 14:49:14.053335 master-0 kubenswrapper[28758]: E0223 14:49:14.053268 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed2aa93-0ee8-4867-b988-9f8834149437" containerName="mariadb-account-create-update" Feb 23 14:49:14.053335 master-0 kubenswrapper[28758]: I0223 14:49:14.053276 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed2aa93-0ee8-4867-b988-9f8834149437" containerName="mariadb-account-create-update" Feb 23 14:49:14.053628 master-0 kubenswrapper[28758]: I0223 14:49:14.053538 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c3df66b-646e-4494-90d6-d17492127413" containerName="mariadb-database-create" Feb 23 14:49:14.053628 master-0 kubenswrapper[28758]: I0223 14:49:14.053568 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" containerName="keystone-db-sync" Feb 23 14:49:14.063508 master-0 kubenswrapper[28758]: I0223 14:49:14.062638 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac22f15-2beb-4f7e-b34c-842e8c0fa082" containerName="mariadb-account-create-update" Feb 23 14:49:14.063508 master-0 
kubenswrapper[28758]: I0223 14:49:14.062725 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="162300cf-7eec-4b50-8667-5b6abea5e4d1" containerName="dnsmasq-dns"
Feb 23 14:49:14.063508 master-0 kubenswrapper[28758]: I0223 14:49:14.062766 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed2aa93-0ee8-4867-b988-9f8834149437" containerName="mariadb-account-create-update"
Feb 23 14:49:14.063508 master-0 kubenswrapper[28758]: I0223 14:49:14.062781 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="24013480-c173-4d1f-8d00-2cffa90b04ef" containerName="mariadb-database-create"
Feb 23 14:49:14.067783 master-0 kubenswrapper[28758]: I0223 14:49:14.067649 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.153036 master-0 kubenswrapper[28758]: I0223 14:49:14.152973 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-swift-storage-0\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.153918 master-0 kubenswrapper[28758]: I0223 14:49:14.153071 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-svc\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.153918 master-0 kubenswrapper[28758]: I0223 14:49:14.153110 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmg64\" (UniqueName: \"kubernetes.io/projected/f9248a4c-32fb-402d-8277-a8ea33fe97fe-kube-api-access-hmg64\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.153918 master-0 kubenswrapper[28758]: I0223 14:49:14.153164 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-nb\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.153918 master-0 kubenswrapper[28758]: I0223 14:49:14.153190 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-config\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.153918 master-0 kubenswrapper[28758]: I0223 14:49:14.153400 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-sb\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.168148 master-0 kubenswrapper[28758]: I0223 14:49:14.166597 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bb5cf47f-hcx8l"]
Feb 23 14:49:14.255780 master-0 kubenswrapper[28758]: I0223 14:49:14.255707 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-sb\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.255992 master-0 kubenswrapper[28758]: I0223 14:49:14.255839 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-swift-storage-0\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.255992 master-0 kubenswrapper[28758]: I0223 14:49:14.255876 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-svc\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.255992 master-0 kubenswrapper[28758]: I0223 14:49:14.255902 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmg64\" (UniqueName: \"kubernetes.io/projected/f9248a4c-32fb-402d-8277-a8ea33fe97fe-kube-api-access-hmg64\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.255992 master-0 kubenswrapper[28758]: I0223 14:49:14.255949 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-nb\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.255992 master-0 kubenswrapper[28758]: I0223 14:49:14.255988 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-config\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.258442 master-0 kubenswrapper[28758]: I0223 14:49:14.256969 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-config\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.260504 master-0 kubenswrapper[28758]: I0223 14:49:14.260422 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-sb\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.265533 master-0 kubenswrapper[28758]: I0223 14:49:14.261454 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-svc\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.265533 master-0 kubenswrapper[28758]: I0223 14:49:14.262419 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-nb\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.265533 master-0 kubenswrapper[28758]: I0223 14:49:14.263441 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-swift-storage-0\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.286497 master-0 kubenswrapper[28758]: I0223 14:49:14.284784 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b85ps"]
Feb 23 14:49:14.286497 master-0 kubenswrapper[28758]: I0223 14:49:14.286194 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.300551 master-0 kubenswrapper[28758]: I0223 14:49:14.300091 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 23 14:49:14.300551 master-0 kubenswrapper[28758]: I0223 14:49:14.300403 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 23 14:49:14.300806 master-0 kubenswrapper[28758]: I0223 14:49:14.300660 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 23 14:49:14.300846 master-0 kubenswrapper[28758]: I0223 14:49:14.300804 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 23 14:49:14.334770 master-0 kubenswrapper[28758]: I0223 14:49:14.330982 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmg64\" (UniqueName: \"kubernetes.io/projected/f9248a4c-32fb-402d-8277-a8ea33fe97fe-kube-api-access-hmg64\") pod \"dnsmasq-dns-57bb5cf47f-hcx8l\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.361509 master-0 kubenswrapper[28758]: I0223 14:49:14.357116 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-config-data\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.361509 master-0 kubenswrapper[28758]: I0223 14:49:14.357324 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-credential-keys\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.361509 master-0 kubenswrapper[28758]: I0223 14:49:14.357381 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8gb\" (UniqueName: \"kubernetes.io/projected/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-kube-api-access-6w8gb\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.361509 master-0 kubenswrapper[28758]: I0223 14:49:14.357408 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-fernet-keys\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.361509 master-0 kubenswrapper[28758]: I0223 14:49:14.357451 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-scripts\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.361509 master-0 kubenswrapper[28758]: I0223 14:49:14.357521 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-combined-ca-bundle\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.420510 master-0 kubenswrapper[28758]: I0223 14:49:14.419736 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b85ps"]
Feb 23 14:49:14.498395 master-0 kubenswrapper[28758]: I0223 14:49:14.498260 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-config-data\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.498630 master-0 kubenswrapper[28758]: I0223 14:49:14.498419 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-credential-keys\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.498630 master-0 kubenswrapper[28758]: I0223 14:49:14.498458 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8gb\" (UniqueName: \"kubernetes.io/projected/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-kube-api-access-6w8gb\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.498630 master-0 kubenswrapper[28758]: I0223 14:49:14.498499 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-fernet-keys\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.498630 master-0 kubenswrapper[28758]: I0223 14:49:14.498579 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-scripts\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.498766 master-0 kubenswrapper[28758]: I0223 14:49:14.498683 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-combined-ca-bundle\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.499467 master-0 kubenswrapper[28758]: I0223 14:49:14.499214 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l"
Feb 23 14:49:14.505212 master-0 kubenswrapper[28758]: I0223 14:49:14.504287 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-fernet-keys\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.505212 master-0 kubenswrapper[28758]: I0223 14:49:14.504862 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-config-data\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.506716 master-0 kubenswrapper[28758]: I0223 14:49:14.506665 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-scripts\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.507073 master-0 kubenswrapper[28758]: I0223 14:49:14.507036 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-combined-ca-bundle\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.536277 master-0 kubenswrapper[28758]: I0223 14:49:14.522177 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8gb\" (UniqueName: \"kubernetes.io/projected/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-kube-api-access-6w8gb\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.536277 master-0 kubenswrapper[28758]: I0223 14:49:14.522339 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-credential-keys\") pod \"keystone-bootstrap-b85ps\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.536277 master-0 kubenswrapper[28758]: I0223 14:49:14.528675 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-b22m4"]
Feb 23 14:49:14.536277 master-0 kubenswrapper[28758]: I0223 14:49:14.530130 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.543375 master-0 kubenswrapper[28758]: I0223 14:49:14.543320 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-db-sync-f2ddt"]
Feb 23 14:49:14.549809 master-0 kubenswrapper[28758]: I0223 14:49:14.549747 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.554128 master-0 kubenswrapper[28758]: I0223 14:49:14.554073 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-scripts"
Feb 23 14:49:14.554285 master-0 kubenswrapper[28758]: I0223 14:49:14.554238 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-config-data"
Feb 23 14:49:14.562037 master-0 kubenswrapper[28758]: I0223 14:49:14.561938 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-db-sync-f2ddt"]
Feb 23 14:49:14.578437 master-0 kubenswrapper[28758]: I0223 14:49:14.578342 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-b22m4"]
Feb 23 14:49:14.600900 master-0 kubenswrapper[28758]: I0223 14:49:14.598998 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-tnlrn"]
Feb 23 14:49:14.600900 master-0 kubenswrapper[28758]: I0223 14:49:14.600413 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-scripts\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.600900 master-0 kubenswrapper[28758]: I0223 14:49:14.600541 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0378b9ac-d258-466f-8e8d-c1d27932f3b2-etc-machine-id\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.601283 master-0 kubenswrapper[28758]: I0223 14:49:14.601258 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-combined-ca-bundle\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.601424 master-0 kubenswrapper[28758]: I0223 14:49:14.601348 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-db-sync-config-data\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.601424 master-0 kubenswrapper[28758]: I0223 14:49:14.601376 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr5xv\" (UniqueName: \"kubernetes.io/projected/0378b9ac-d258-466f-8e8d-c1d27932f3b2-kube-api-access-tr5xv\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.601584 master-0 kubenswrapper[28758]: I0223 14:49:14.601417 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-config-data\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.605593 master-0 kubenswrapper[28758]: I0223 14:49:14.605074 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tnlrn"
Feb 23 14:49:14.606844 master-0 kubenswrapper[28758]: I0223 14:49:14.606805 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 23 14:49:14.607285 master-0 kubenswrapper[28758]: I0223 14:49:14.607258 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 23 14:49:14.613557 master-0 kubenswrapper[28758]: I0223 14:49:14.611375 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-279b-account-create-update-vbsbw"]
Feb 23 14:49:14.613557 master-0 kubenswrapper[28758]: I0223 14:49:14.612701 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-279b-account-create-update-vbsbw"
Feb 23 14:49:14.619160 master-0 kubenswrapper[28758]: I0223 14:49:14.619109 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret"
Feb 23 14:49:14.632366 master-0 kubenswrapper[28758]: I0223 14:49:14.632322 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tnlrn"]
Feb 23 14:49:14.644929 master-0 kubenswrapper[28758]: I0223 14:49:14.643012 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-279b-account-create-update-vbsbw"]
Feb 23 14:49:14.654984 master-0 kubenswrapper[28758]: I0223 14:49:14.654920 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bb5cf47f-hcx8l"]
Feb 23 14:49:14.670056 master-0 kubenswrapper[28758]: I0223 14:49:14.669213 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-w9z2m"]
Feb 23 14:49:14.673536 master-0 kubenswrapper[28758]: I0223 14:49:14.672406 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-w9z2m"
Feb 23 14:49:14.675761 master-0 kubenswrapper[28758]: I0223 14:49:14.675706 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 23 14:49:14.679011 master-0 kubenswrapper[28758]: I0223 14:49:14.678969 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 23 14:49:14.680038 master-0 kubenswrapper[28758]: I0223 14:49:14.680001 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ddcf4757-slsxg"]
Feb 23 14:49:14.682467 master-0 kubenswrapper[28758]: I0223 14:49:14.682432 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.697584 master-0 kubenswrapper[28758]: I0223 14:49:14.697441 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-w9z2m"]
Feb 23 14:49:14.712449 master-0 kubenswrapper[28758]: I0223 14:49:14.712169 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ddcf4757-slsxg"]
Feb 23 14:49:14.718677 master-0 kubenswrapper[28758]: I0223 14:49:14.718551 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6p69\" (UniqueName: \"kubernetes.io/projected/dcc685c7-da7b-4080-972d-1b4bd11f68ea-kube-api-access-h6p69\") pod \"ironic-db-create-b22m4\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.719356 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b85ps"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.720368 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-scripts\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.720511 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc685c7-da7b-4080-972d-1b4bd11f68ea-operator-scripts\") pod \"ironic-db-create-b22m4\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.720555 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0378b9ac-d258-466f-8e8d-c1d27932f3b2-etc-machine-id\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.720581 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-combined-ca-bundle\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.720630 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-db-sync-config-data\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.720659 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr5xv\" (UniqueName: \"kubernetes.io/projected/0378b9ac-d258-466f-8e8d-c1d27932f3b2-kube-api-access-tr5xv\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.720909 master-0 kubenswrapper[28758]: I0223 14:49:14.720716 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-config-data\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.721322 master-0 kubenswrapper[28758]: I0223 14:49:14.721013 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0378b9ac-d258-466f-8e8d-c1d27932f3b2-etc-machine-id\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.724675 master-0 kubenswrapper[28758]: I0223 14:49:14.724523 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-combined-ca-bundle\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.739149 master-0 kubenswrapper[28758]: I0223 14:49:14.739087 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-scripts\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.754191 master-0 kubenswrapper[28758]: I0223 14:49:14.754103 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-db-sync-config-data\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.758291 master-0 kubenswrapper[28758]: I0223 14:49:14.756244 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-config-data\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.758291 master-0 kubenswrapper[28758]: I0223 14:49:14.758102 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr5xv\" (UniqueName: \"kubernetes.io/projected/0378b9ac-d258-466f-8e8d-c1d27932f3b2-kube-api-access-tr5xv\") pod \"cinder-2d990-db-sync-f2ddt\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " pod="openstack/cinder-2d990-db-sync-f2ddt"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.828824 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rt88\" (UniqueName: \"kubernetes.io/projected/6d0c7920-4214-40e6-83d6-46e306919ec4-kube-api-access-5rt88\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.828918 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49m5\" (UniqueName: \"kubernetes.io/projected/0344160e-18f1-422b-90f8-663a15320959-kube-api-access-k49m5\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.828955 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-config\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.828991 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-swift-storage-0\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829052 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6p69\" (UniqueName: \"kubernetes.io/projected/dcc685c7-da7b-4080-972d-1b4bd11f68ea-kube-api-access-h6p69\") pod \"ironic-db-create-b22m4\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829094 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpqhd\" (UniqueName: \"kubernetes.io/projected/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-kube-api-access-tpqhd\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829115 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-sb\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829133 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-config-data\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829150 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24xqs\" (UniqueName: \"kubernetes.io/projected/651d3a04-116a-4337-8f42-3865d8a0b9be-kube-api-access-24xqs\") pod \"ironic-279b-account-create-update-vbsbw\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " pod="openstack/ironic-279b-account-create-update-vbsbw"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829175 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-config\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829209 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651d3a04-116a-4337-8f42-3865d8a0b9be-operator-scripts\") pod \"ironic-279b-account-create-update-vbsbw\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " pod="openstack/ironic-279b-account-create-update-vbsbw"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829256 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-svc\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829301 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-nb\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829363 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-combined-ca-bundle\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829382 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-scripts\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829420 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc685c7-da7b-4080-972d-1b4bd11f68ea-operator-scripts\") pod \"ironic-db-create-b22m4\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829494 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0344160e-18f1-422b-90f8-663a15320959-logs\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m"
Feb 23 14:49:14.829559 master-0 kubenswrapper[28758]: I0223 14:49:14.829514 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-combined-ca-bundle\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn"
Feb 23 14:49:14.831530 master-0 kubenswrapper[28758]: I0223 14:49:14.830682 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc685c7-da7b-4080-972d-1b4bd11f68ea-operator-scripts\") pod \"ironic-db-create-b22m4\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.858077 master-0 kubenswrapper[28758]: I0223 14:49:14.857929 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6p69\" (UniqueName: \"kubernetes.io/projected/dcc685c7-da7b-4080-972d-1b4bd11f68ea-kube-api-access-h6p69\") pod \"ironic-db-create-b22m4\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.921849 master-0 kubenswrapper[28758]: I0223 14:49:14.920349 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-b22m4"
Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.935726 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpqhd\" (UniqueName: \"kubernetes.io/projected/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-kube-api-access-tpqhd\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.935796 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-sb\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.935821 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-config-data\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m"
Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.935846 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24xqs\" (UniqueName: \"kubernetes.io/projected/651d3a04-116a-4337-8f42-3865d8a0b9be-kube-api-access-24xqs\") pod \"ironic-279b-account-create-update-vbsbw\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " pod="openstack/ironic-279b-account-create-update-vbsbw"
Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.935878 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-config\") pod \"neutron-db-sync-tnlrn\" (UID:
\"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.935917 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651d3a04-116a-4337-8f42-3865d8a0b9be-operator-scripts\") pod \"ironic-279b-account-create-update-vbsbw\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " pod="openstack/ironic-279b-account-create-update-vbsbw" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.935965 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-svc\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936015 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-nb\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936076 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-combined-ca-bundle\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936100 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-scripts\") pod \"placement-db-sync-w9z2m\" (UID: 
\"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936168 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0344160e-18f1-422b-90f8-663a15320959-logs\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936195 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-combined-ca-bundle\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936234 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rt88\" (UniqueName: \"kubernetes.io/projected/6d0c7920-4214-40e6-83d6-46e306919ec4-kube-api-access-5rt88\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936264 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49m5\" (UniqueName: \"kubernetes.io/projected/0344160e-18f1-422b-90f8-663a15320959-kube-api-access-k49m5\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936287 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-config\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") 
" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.937112 master-0 kubenswrapper[28758]: I0223 14:49:14.936316 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-swift-storage-0\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.943129 master-0 kubenswrapper[28758]: I0223 14:49:14.943068 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-db-sync-f2ddt" Feb 23 14:49:14.944631 master-0 kubenswrapper[28758]: I0223 14:49:14.944589 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-nb\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.945625 master-0 kubenswrapper[28758]: I0223 14:49:14.945589 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-sb\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.949978 master-0 kubenswrapper[28758]: I0223 14:49:14.949934 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-config-data\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:14.951323 master-0 kubenswrapper[28758]: I0223 14:49:14.951272 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-swift-storage-0\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.967935 master-0 kubenswrapper[28758]: I0223 14:49:14.967885 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-config\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:14.973449 master-0 kubenswrapper[28758]: I0223 14:49:14.973400 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651d3a04-116a-4337-8f42-3865d8a0b9be-operator-scripts\") pod \"ironic-279b-account-create-update-vbsbw\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " pod="openstack/ironic-279b-account-create-update-vbsbw" Feb 23 14:49:14.987423 master-0 kubenswrapper[28758]: I0223 14:49:14.986852 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0344160e-18f1-422b-90f8-663a15320959-logs\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:14.993114 master-0 kubenswrapper[28758]: I0223 14:49:14.992394 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-svc\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:15.018500 master-0 kubenswrapper[28758]: I0223 14:49:15.010870 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpqhd\" (UniqueName: 
\"kubernetes.io/projected/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-kube-api-access-tpqhd\") pod \"dnsmasq-dns-ddcf4757-slsxg\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") " pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:15.018500 master-0 kubenswrapper[28758]: I0223 14:49:15.016582 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:15.027066 master-0 kubenswrapper[28758]: I0223 14:49:15.026381 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-combined-ca-bundle\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:15.036502 master-0 kubenswrapper[28758]: I0223 14:49:15.034979 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rt88\" (UniqueName: \"kubernetes.io/projected/6d0c7920-4214-40e6-83d6-46e306919ec4-kube-api-access-5rt88\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn" Feb 23 14:49:15.037778 master-0 kubenswrapper[28758]: I0223 14:49:15.037190 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49m5\" (UniqueName: \"kubernetes.io/projected/0344160e-18f1-422b-90f8-663a15320959-kube-api-access-k49m5\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:15.051500 master-0 kubenswrapper[28758]: I0223 14:49:15.043339 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24xqs\" (UniqueName: \"kubernetes.io/projected/651d3a04-116a-4337-8f42-3865d8a0b9be-kube-api-access-24xqs\") pod \"ironic-279b-account-create-update-vbsbw\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " 
pod="openstack/ironic-279b-account-create-update-vbsbw" Feb 23 14:49:15.078748 master-0 kubenswrapper[28758]: I0223 14:49:15.074922 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-scripts\") pod \"placement-db-sync-w9z2m\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:15.104552 master-0 kubenswrapper[28758]: I0223 14:49:15.104416 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-combined-ca-bundle\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn" Feb 23 14:49:15.118506 master-0 kubenswrapper[28758]: I0223 14:49:15.110279 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-config\") pod \"neutron-db-sync-tnlrn\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") " pod="openstack/neutron-db-sync-tnlrn" Feb 23 14:49:15.180638 master-0 kubenswrapper[28758]: I0223 14:49:15.180574 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bb5cf47f-hcx8l"] Feb 23 14:49:15.275272 master-0 kubenswrapper[28758]: I0223 14:49:15.275032 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tnlrn" Feb 23 14:49:15.286335 master-0 kubenswrapper[28758]: I0223 14:49:15.284071 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-279b-account-create-update-vbsbw" Feb 23 14:49:15.298091 master-0 kubenswrapper[28758]: I0223 14:49:15.297416 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:15.493957 master-0 kubenswrapper[28758]: W0223 14:49:15.493886 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48a556fc_7e30_4fd5_b266_6315dfdcb0e8.slice/crio-d992c9530995798c0c16c16975b8ff09dfaa1b158be2eda87b3ee31b953a8fb0 WatchSource:0}: Error finding container d992c9530995798c0c16c16975b8ff09dfaa1b158be2eda87b3ee31b953a8fb0: Status 404 returned error can't find the container with id d992c9530995798c0c16c16975b8ff09dfaa1b158be2eda87b3ee31b953a8fb0 Feb 23 14:49:15.512241 master-0 kubenswrapper[28758]: I0223 14:49:15.512207 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b85ps"] Feb 23 14:49:15.775735 master-0 kubenswrapper[28758]: I0223 14:49:15.775605 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l" event={"ID":"f9248a4c-32fb-402d-8277-a8ea33fe97fe","Type":"ContainerStarted","Data":"9192da4b87201f502339afcd9c1a97c43c62c36ebedf352bfe0f67951baf8f67"} Feb 23 14:49:15.775735 master-0 kubenswrapper[28758]: I0223 14:49:15.775668 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l" event={"ID":"f9248a4c-32fb-402d-8277-a8ea33fe97fe","Type":"ContainerStarted","Data":"24a27a9bfd57f92c326eb11251741a9985f74153b518201edf627a9539de3f4d"} Feb 23 14:49:15.777547 master-0 kubenswrapper[28758]: I0223 14:49:15.777499 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b85ps" event={"ID":"48a556fc-7e30-4fd5-b266-6315dfdcb0e8","Type":"ContainerStarted","Data":"3a7a3ce5df768907856540049b9397f5cff2fd6bc39ac7546a2c808ab9445216"} Feb 23 14:49:15.777547 master-0 kubenswrapper[28758]: I0223 14:49:15.777538 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b85ps" 
event={"ID":"48a556fc-7e30-4fd5-b266-6315dfdcb0e8","Type":"ContainerStarted","Data":"d992c9530995798c0c16c16975b8ff09dfaa1b158be2eda87b3ee31b953a8fb0"} Feb 23 14:49:15.820750 master-0 kubenswrapper[28758]: W0223 14:49:15.820673 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcc685c7_da7b_4080_972d_1b4bd11f68ea.slice/crio-2036a8e00b5b47a4b527e01cebb846d497aa315490e198430a6ac0b3a78b100e WatchSource:0}: Error finding container 2036a8e00b5b47a4b527e01cebb846d497aa315490e198430a6ac0b3a78b100e: Status 404 returned error can't find the container with id 2036a8e00b5b47a4b527e01cebb846d497aa315490e198430a6ac0b3a78b100e Feb 23 14:49:15.823900 master-0 kubenswrapper[28758]: I0223 14:49:15.823474 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-b22m4"] Feb 23 14:49:16.152164 master-0 kubenswrapper[28758]: W0223 14:49:16.152033 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0344160e_18f1_422b_90f8_663a15320959.slice/crio-c5025665d0d8448131710b08518c038259de72f4a3e35815a1fef70febd6dade WatchSource:0}: Error finding container c5025665d0d8448131710b08518c038259de72f4a3e35815a1fef70febd6dade: Status 404 returned error can't find the container with id c5025665d0d8448131710b08518c038259de72f4a3e35815a1fef70febd6dade Feb 23 14:49:16.160030 master-0 kubenswrapper[28758]: W0223 14:49:16.158358 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf30a16e8_1f43_43c6_bc0c_e471a0cda8ef.slice/crio-001346ba47fe071524b194d59c260c1ff7699837870361fc77044032a22fa2e4 WatchSource:0}: Error finding container 001346ba47fe071524b194d59c260c1ff7699837870361fc77044032a22fa2e4: Status 404 returned error can't find the container with id 001346ba47fe071524b194d59c260c1ff7699837870361fc77044032a22fa2e4 Feb 23 
14:49:16.177310 master-0 kubenswrapper[28758]: I0223 14:49:16.176887 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-db-sync-f2ddt"] Feb 23 14:49:16.192128 master-0 kubenswrapper[28758]: I0223 14:49:16.192077 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-w9z2m"] Feb 23 14:49:16.235588 master-0 kubenswrapper[28758]: I0223 14:49:16.235537 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ddcf4757-slsxg"] Feb 23 14:49:16.244399 master-0 kubenswrapper[28758]: I0223 14:49:16.244341 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:16.257108 master-0 kubenswrapper[28758]: I0223 14:49:16.246810 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.257108 master-0 kubenswrapper[28758]: I0223 14:49:16.251427 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 23 14:49:16.257108 master-0 kubenswrapper[28758]: I0223 14:49:16.251586 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 14:49:16.257108 master-0 kubenswrapper[28758]: I0223 14:49:16.251982 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-63e78-default-external-config-data" Feb 23 14:49:16.257108 master-0 kubenswrapper[28758]: I0223 14:49:16.253851 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:16.279208 master-0 kubenswrapper[28758]: I0223 14:49:16.279139 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tnlrn"] Feb 23 14:49:16.347812 master-0 kubenswrapper[28758]: I0223 14:49:16.347737 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ironic-279b-account-create-update-vbsbw"] Feb 23 14:49:16.356308 master-0 kubenswrapper[28758]: W0223 14:49:16.356074 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod651d3a04_116a_4337_8f42_3865d8a0b9be.slice/crio-302908afd78ed9546d99d84ae19db1fc147ed9739a995ae1f1bb4ada14a330f8 WatchSource:0}: Error finding container 302908afd78ed9546d99d84ae19db1fc147ed9739a995ae1f1bb4ada14a330f8: Status 404 returned error can't find the container with id 302908afd78ed9546d99d84ae19db1fc147ed9739a995ae1f1bb4ada14a330f8 Feb 23 14:49:16.426294 master-0 kubenswrapper[28758]: I0223 14:49:16.426236 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.427074 master-0 kubenswrapper[28758]: I0223 14:49:16.426596 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.427074 master-0 kubenswrapper[28758]: I0223 14:49:16.426685 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.427074 master-0 kubenswrapper[28758]: I0223 14:49:16.426725 
28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.427074 master-0 kubenswrapper[28758]: I0223 14:49:16.427008 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-public-tls-certs\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.427318 master-0 kubenswrapper[28758]: I0223 14:49:16.427094 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.427318 master-0 kubenswrapper[28758]: I0223 14:49:16.427172 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.427318 master-0 kubenswrapper[28758]: I0223 14:49:16.427214 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szjtd\" (UniqueName: \"kubernetes.io/projected/462dd76d-fc73-4153-a592-d8d2fa38aaf1-kube-api-access-szjtd\") pod \"glance-63e78-default-external-api-0\" (UID: 
\"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.530045 master-0 kubenswrapper[28758]: I0223 14:49:16.529978 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-public-tls-certs\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.530882 master-0 kubenswrapper[28758]: I0223 14:49:16.530843 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.530959 master-0 kubenswrapper[28758]: I0223 14:49:16.530911 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.530959 master-0 kubenswrapper[28758]: I0223 14:49:16.530949 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szjtd\" (UniqueName: \"kubernetes.io/projected/462dd76d-fc73-4153-a592-d8d2fa38aaf1-kube-api-access-szjtd\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.531438 master-0 kubenswrapper[28758]: I0223 14:49:16.531072 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.531438 master-0 kubenswrapper[28758]: I0223 14:49:16.531338 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.531438 master-0 kubenswrapper[28758]: I0223 14:49:16.531375 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.531438 master-0 kubenswrapper[28758]: I0223 14:49:16.531412 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.532517 master-0 kubenswrapper[28758]: I0223 14:49:16.532427 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.534381 master-0 kubenswrapper[28758]: I0223 14:49:16.533808 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.535967 master-0 kubenswrapper[28758]: I0223 14:49:16.535820 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 14:49:16.535967 master-0 kubenswrapper[28758]: I0223 14:49:16.535877 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/e3fa539436bb43727cdf11c2c03cf22a4b969059756e5f07ca659a4a6862fdb6/globalmount\"" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.536404 master-0 kubenswrapper[28758]: I0223 14:49:16.536359 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.537559 master-0 kubenswrapper[28758]: I0223 14:49:16.537524 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.539147 master-0 kubenswrapper[28758]: I0223 14:49:16.539068 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.544576 master-0 kubenswrapper[28758]: I0223 14:49:16.540210 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-public-tls-certs\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.550924 master-0 kubenswrapper[28758]: I0223 14:49:16.550878 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szjtd\" (UniqueName: \"kubernetes.io/projected/462dd76d-fc73-4153-a592-d8d2fa38aaf1-kube-api-access-szjtd\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:16.812815 master-0 kubenswrapper[28758]: I0223 14:49:16.806108 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" event={"ID":"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef","Type":"ContainerStarted","Data":"001346ba47fe071524b194d59c260c1ff7699837870361fc77044032a22fa2e4"} Feb 23 14:49:16.820529 master-0 kubenswrapper[28758]: I0223 14:49:16.820210 28758 generic.go:334] "Generic (PLEG): container finished" podID="f9248a4c-32fb-402d-8277-a8ea33fe97fe" containerID="9192da4b87201f502339afcd9c1a97c43c62c36ebedf352bfe0f67951baf8f67" exitCode=0 Feb 23 14:49:16.820529 master-0 kubenswrapper[28758]: I0223 14:49:16.820279 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l" event={"ID":"f9248a4c-32fb-402d-8277-a8ea33fe97fe","Type":"ContainerDied","Data":"9192da4b87201f502339afcd9c1a97c43c62c36ebedf352bfe0f67951baf8f67"} Feb 23 
14:49:16.825495 master-0 kubenswrapper[28758]: I0223 14:49:16.825402 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9z2m" event={"ID":"0344160e-18f1-422b-90f8-663a15320959","Type":"ContainerStarted","Data":"c5025665d0d8448131710b08518c038259de72f4a3e35815a1fef70febd6dade"} Feb 23 14:49:16.827025 master-0 kubenswrapper[28758]: I0223 14:49:16.826989 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-db-sync-f2ddt" event={"ID":"0378b9ac-d258-466f-8e8d-c1d27932f3b2","Type":"ContainerStarted","Data":"4e74d3742b197801ade72053810c65d00838300f1e021eb6a1f6d093d8cf0a11"} Feb 23 14:49:16.834833 master-0 kubenswrapper[28758]: I0223 14:49:16.834428 28758 generic.go:334] "Generic (PLEG): container finished" podID="dcc685c7-da7b-4080-972d-1b4bd11f68ea" containerID="9c9a24046ef728b3310dcfa90c50b1866d4b91e8dd9faa870a330c0dbe7bc618" exitCode=0 Feb 23 14:49:16.834833 master-0 kubenswrapper[28758]: I0223 14:49:16.834711 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-b22m4" event={"ID":"dcc685c7-da7b-4080-972d-1b4bd11f68ea","Type":"ContainerDied","Data":"9c9a24046ef728b3310dcfa90c50b1866d4b91e8dd9faa870a330c0dbe7bc618"} Feb 23 14:49:16.834833 master-0 kubenswrapper[28758]: I0223 14:49:16.834754 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-b22m4" event={"ID":"dcc685c7-da7b-4080-972d-1b4bd11f68ea","Type":"ContainerStarted","Data":"2036a8e00b5b47a4b527e01cebb846d497aa315490e198430a6ac0b3a78b100e"} Feb 23 14:49:16.840389 master-0 kubenswrapper[28758]: I0223 14:49:16.840314 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-279b-account-create-update-vbsbw" event={"ID":"651d3a04-116a-4337-8f42-3865d8a0b9be","Type":"ContainerStarted","Data":"302908afd78ed9546d99d84ae19db1fc147ed9739a995ae1f1bb4ada14a330f8"} Feb 23 14:49:16.845449 master-0 kubenswrapper[28758]: I0223 14:49:16.845343 28758 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tnlrn" event={"ID":"6d0c7920-4214-40e6-83d6-46e306919ec4","Type":"ContainerStarted","Data":"07f8bb525d47f0913ba3f4e74c1e76a4eb4a1c1bed843ddb4abcdc854b95ce0a"} Feb 23 14:49:16.845613 master-0 kubenswrapper[28758]: I0223 14:49:16.845453 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tnlrn" event={"ID":"6d0c7920-4214-40e6-83d6-46e306919ec4","Type":"ContainerStarted","Data":"6cf6a2e7247c6aeaf35c96acf10bdb278d896ed014749941e4872c96a2ef299f"} Feb 23 14:49:16.894193 master-0 kubenswrapper[28758]: I0223 14:49:16.894046 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-279b-account-create-update-vbsbw" podStartSLOduration=2.894027489 podStartE2EDuration="2.894027489s" podCreationTimestamp="2026-02-23 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:16.885666186 +0000 UTC m=+889.011982138" watchObservedRunningTime="2026-02-23 14:49:16.894027489 +0000 UTC m=+889.020343421" Feb 23 14:49:16.928413 master-0 kubenswrapper[28758]: I0223 14:49:16.928256 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-tnlrn" podStartSLOduration=2.928239548 podStartE2EDuration="2.928239548s" podCreationTimestamp="2026-02-23 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:16.927293042 +0000 UTC m=+889.053608974" watchObservedRunningTime="2026-02-23 14:49:16.928239548 +0000 UTC m=+889.054555480" Feb 23 14:49:16.991875 master-0 kubenswrapper[28758]: I0223 14:49:16.991805 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b85ps" podStartSLOduration=2.991784116 podStartE2EDuration="2.991784116s" 
podCreationTimestamp="2026-02-23 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:16.988087068 +0000 UTC m=+889.114403000" watchObservedRunningTime="2026-02-23 14:49:16.991784116 +0000 UTC m=+889.118100048" Feb 23 14:49:17.280237 master-0 kubenswrapper[28758]: I0223 14:49:17.280184 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:17.281170 master-0 kubenswrapper[28758]: E0223 14:49:17.281132 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-63e78-default-external-api-0" podUID="462dd76d-fc73-4153-a592-d8d2fa38aaf1" Feb 23 14:49:17.332725 master-0 kubenswrapper[28758]: I0223 14:49:17.332657 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:17.336093 master-0 kubenswrapper[28758]: I0223 14:49:17.334976 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.340305 master-0 kubenswrapper[28758]: I0223 14:49:17.337316 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-63e78-default-internal-config-data" Feb 23 14:49:17.340305 master-0 kubenswrapper[28758]: I0223 14:49:17.337754 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 14:49:17.354266 master-0 kubenswrapper[28758]: I0223 14:49:17.354218 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:17.454306 master-0 kubenswrapper[28758]: I0223 14:49:17.454252 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvpz\" (UniqueName: \"kubernetes.io/projected/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-kube-api-access-sdvpz\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.454306 master-0 kubenswrapper[28758]: I0223 14:49:17.454311 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.454541 master-0 kubenswrapper[28758]: I0223 14:49:17.454522 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.454585 master-0 
kubenswrapper[28758]: I0223 14:49:17.454552 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.454619 master-0 kubenswrapper[28758]: I0223 14:49:17.454582 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.454619 master-0 kubenswrapper[28758]: I0223 14:49:17.454610 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.454676 master-0 kubenswrapper[28758]: I0223 14:49:17.454644 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.454676 master-0 kubenswrapper[28758]: I0223 14:49:17.454664 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-combined-ca-bundle\") pod 
\"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.461351 master-0 kubenswrapper[28758]: I0223 14:49:17.461273 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:17.476080 master-0 kubenswrapper[28758]: I0223 14:49:17.476026 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l" Feb 23 14:49:17.494517 master-0 kubenswrapper[28758]: E0223 14:49:17.494444 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run internal-tls-certs kube-api-access-sdvpz logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-63e78-default-internal-api-0" podUID="e5a697af-fabe-4d68-b98f-eb9d3ba688f1" Feb 23 14:49:17.556207 master-0 kubenswrapper[28758]: I0223 14:49:17.556100 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-config\") pod \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " Feb 23 14:49:17.556207 master-0 kubenswrapper[28758]: I0223 14:49:17.556197 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-svc\") pod \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " Feb 23 14:49:17.556539 master-0 kubenswrapper[28758]: I0223 14:49:17.556296 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-swift-storage-0\") pod \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\" 
(UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " Feb 23 14:49:17.556539 master-0 kubenswrapper[28758]: I0223 14:49:17.556351 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmg64\" (UniqueName: \"kubernetes.io/projected/f9248a4c-32fb-402d-8277-a8ea33fe97fe-kube-api-access-hmg64\") pod \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " Feb 23 14:49:17.556539 master-0 kubenswrapper[28758]: I0223 14:49:17.556436 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-nb\") pod \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " Feb 23 14:49:17.556539 master-0 kubenswrapper[28758]: I0223 14:49:17.556465 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-sb\") pod \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\" (UID: \"f9248a4c-32fb-402d-8277-a8ea33fe97fe\") " Feb 23 14:49:17.557122 master-0 kubenswrapper[28758]: I0223 14:49:17.556874 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.557122 master-0 kubenswrapper[28758]: I0223 14:49:17.556923 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" 
Feb 23 14:49:17.557122 master-0 kubenswrapper[28758]: I0223 14:49:17.556965 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.557122 master-0 kubenswrapper[28758]: I0223 14:49:17.556995 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.557122 master-0 kubenswrapper[28758]: I0223 14:49:17.557035 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.557122 master-0 kubenswrapper[28758]: I0223 14:49:17.557071 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.557122 master-0 kubenswrapper[28758]: I0223 14:49:17.557103 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvpz\" (UniqueName: \"kubernetes.io/projected/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-kube-api-access-sdvpz\") pod \"glance-63e78-default-internal-api-0\" (UID: 
\"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.557376 master-0 kubenswrapper[28758]: I0223 14:49:17.557127 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.558575 master-0 kubenswrapper[28758]: I0223 14:49:17.558527 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.559198 master-0 kubenswrapper[28758]: I0223 14:49:17.559165 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.560401 master-0 kubenswrapper[28758]: I0223 14:49:17.560193 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 14:49:17.560401 master-0 kubenswrapper[28758]: I0223 14:49:17.560251 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/152be6b8b3317aae1186e616b7fc755c4c4a3c8c7584c6b0c3c221e4abb1d1f4/globalmount\"" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.564312 master-0 kubenswrapper[28758]: I0223 14:49:17.564212 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.569693 master-0 kubenswrapper[28758]: I0223 14:49:17.569653 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.576157 master-0 kubenswrapper[28758]: I0223 14:49:17.576091 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9248a4c-32fb-402d-8277-a8ea33fe97fe-kube-api-access-hmg64" (OuterVolumeSpecName: "kube-api-access-hmg64") pod "f9248a4c-32fb-402d-8277-a8ea33fe97fe" (UID: "f9248a4c-32fb-402d-8277-a8ea33fe97fe"). InnerVolumeSpecName "kube-api-access-hmg64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:17.576745 master-0 kubenswrapper[28758]: I0223 14:49:17.576699 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.584738 master-0 kubenswrapper[28758]: I0223 14:49:17.584676 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.594565 master-0 kubenswrapper[28758]: I0223 14:49:17.594434 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvpz\" (UniqueName: \"kubernetes.io/projected/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-kube-api-access-sdvpz\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.604126 master-0 kubenswrapper[28758]: I0223 14:49:17.604056 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-config" (OuterVolumeSpecName: "config") pod "f9248a4c-32fb-402d-8277-a8ea33fe97fe" (UID: "f9248a4c-32fb-402d-8277-a8ea33fe97fe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:17.604874 master-0 kubenswrapper[28758]: I0223 14:49:17.604828 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9248a4c-32fb-402d-8277-a8ea33fe97fe" (UID: "f9248a4c-32fb-402d-8277-a8ea33fe97fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:17.611098 master-0 kubenswrapper[28758]: I0223 14:49:17.611033 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9248a4c-32fb-402d-8277-a8ea33fe97fe" (UID: "f9248a4c-32fb-402d-8277-a8ea33fe97fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:17.632077 master-0 kubenswrapper[28758]: I0223 14:49:17.632026 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9248a4c-32fb-402d-8277-a8ea33fe97fe" (UID: "f9248a4c-32fb-402d-8277-a8ea33fe97fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:17.640681 master-0 kubenswrapper[28758]: I0223 14:49:17.640603 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9248a4c-32fb-402d-8277-a8ea33fe97fe" (UID: "f9248a4c-32fb-402d-8277-a8ea33fe97fe"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:17.664615 master-0 kubenswrapper[28758]: I0223 14:49:17.663904 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmg64\" (UniqueName: \"kubernetes.io/projected/f9248a4c-32fb-402d-8277-a8ea33fe97fe-kube-api-access-hmg64\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:17.664615 master-0 kubenswrapper[28758]: I0223 14:49:17.663968 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:17.664615 master-0 kubenswrapper[28758]: I0223 14:49:17.663979 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:17.664615 master-0 kubenswrapper[28758]: I0223 14:49:17.663991 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:17.664615 master-0 kubenswrapper[28758]: I0223 14:49:17.664002 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:17.664615 master-0 kubenswrapper[28758]: I0223 14:49:17.664014 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9248a4c-32fb-402d-8277-a8ea33fe97fe-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:17.862955 master-0 kubenswrapper[28758]: I0223 14:49:17.862802 28758 generic.go:334] "Generic (PLEG): container finished" podID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" 
containerID="559c41debc259ac95fb37f6f92e19f67d910b7c8815d98b28491eb492e7aebd6" exitCode=0 Feb 23 14:49:17.863159 master-0 kubenswrapper[28758]: I0223 14:49:17.862963 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" event={"ID":"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef","Type":"ContainerDied","Data":"559c41debc259ac95fb37f6f92e19f67d910b7c8815d98b28491eb492e7aebd6"} Feb 23 14:49:17.879086 master-0 kubenswrapper[28758]: I0223 14:49:17.879015 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l" event={"ID":"f9248a4c-32fb-402d-8277-a8ea33fe97fe","Type":"ContainerDied","Data":"24a27a9bfd57f92c326eb11251741a9985f74153b518201edf627a9539de3f4d"} Feb 23 14:49:17.879086 master-0 kubenswrapper[28758]: I0223 14:49:17.879071 28758 scope.go:117] "RemoveContainer" containerID="9192da4b87201f502339afcd9c1a97c43c62c36ebedf352bfe0f67951baf8f67" Feb 23 14:49:17.879409 master-0 kubenswrapper[28758]: I0223 14:49:17.879288 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bb5cf47f-hcx8l" Feb 23 14:49:17.903670 master-0 kubenswrapper[28758]: I0223 14:49:17.902051 28758 generic.go:334] "Generic (PLEG): container finished" podID="651d3a04-116a-4337-8f42-3865d8a0b9be" containerID="4784d3b4e6d630344a9c8a7cf42d0847dbcac07cd4d210d35398b5a75e06a2a7" exitCode=0 Feb 23 14:49:17.903670 master-0 kubenswrapper[28758]: I0223 14:49:17.902661 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-279b-account-create-update-vbsbw" event={"ID":"651d3a04-116a-4337-8f42-3865d8a0b9be","Type":"ContainerDied","Data":"4784d3b4e6d630344a9c8a7cf42d0847dbcac07cd4d210d35398b5a75e06a2a7"} Feb 23 14:49:17.905836 master-0 kubenswrapper[28758]: I0223 14:49:17.904988 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:17.905836 master-0 kubenswrapper[28758]: I0223 14:49:17.905605 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:17.972570 master-0 kubenswrapper[28758]: I0223 14:49:17.972163 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:17.991045 master-0 kubenswrapper[28758]: I0223 14:49:17.990632 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:18.026958 master-0 kubenswrapper[28758]: I0223 14:49:18.025123 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bb5cf47f-hcx8l"] Feb 23 14:49:18.037445 master-0 kubenswrapper[28758]: I0223 14:49:18.036009 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bb5cf47f-hcx8l"] Feb 23 14:49:18.067343 master-0 kubenswrapper[28758]: I0223 14:49:18.065737 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076253 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-public-tls-certs\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076358 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-httpd-run\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076491 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szjtd\" (UniqueName: \"kubernetes.io/projected/462dd76d-fc73-4153-a592-d8d2fa38aaf1-kube-api-access-szjtd\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076527 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-combined-ca-bundle\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076784 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-config-data\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076828 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-scripts\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076842 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-config-data\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: 
\"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076862 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-logs\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076885 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-scripts\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076905 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-logs\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076944 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-combined-ca-bundle\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.076992 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdvpz\" (UniqueName: \"kubernetes.io/projected/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-kube-api-access-sdvpz\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.077038 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-httpd-run\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.078709 master-0 kubenswrapper[28758]: I0223 14:49:18.077086 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-internal-tls-certs\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:18.086277 master-0 kubenswrapper[28758]: I0223 14:49:18.086232 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.086559 master-0 kubenswrapper[28758]: I0223 14:49:18.086452 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-logs" (OuterVolumeSpecName: "logs") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:49:18.089019 master-0 kubenswrapper[28758]: I0223 14:49:18.088988 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462dd76d-fc73-4153-a592-d8d2fa38aaf1-kube-api-access-szjtd" (OuterVolumeSpecName: "kube-api-access-szjtd") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "kube-api-access-szjtd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:18.089356 master-0 kubenswrapper[28758]: I0223 14:49:18.089306 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:49:18.089719 master-0 kubenswrapper[28758]: I0223 14:49:18.089686 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-scripts" (OuterVolumeSpecName: "scripts") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.089988 master-0 kubenswrapper[28758]: I0223 14:49:18.089965 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-logs" (OuterVolumeSpecName: "logs") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:49:18.092737 master-0 kubenswrapper[28758]: I0223 14:49:18.092699 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-config-data" (OuterVolumeSpecName: "config-data") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.094559 master-0 kubenswrapper[28758]: I0223 14:49:18.094522 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:49:18.094806 master-0 kubenswrapper[28758]: I0223 14:49:18.094732 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.095903 master-0 kubenswrapper[28758]: I0223 14:49:18.095872 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.099917 master-0 kubenswrapper[28758]: I0223 14:49:18.099856 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-scripts" (OuterVolumeSpecName: "scripts") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.103074 master-0 kubenswrapper[28758]: I0223 14:49:18.102956 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-kube-api-access-sdvpz" (OuterVolumeSpecName: "kube-api-access-sdvpz") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "kube-api-access-sdvpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:18.108614 master-0 kubenswrapper[28758]: I0223 14:49:18.108568 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-config-data" (OuterVolumeSpecName: "config-data") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.109556 master-0 kubenswrapper[28758]: I0223 14:49:18.109492 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:18.113909 master-0 kubenswrapper[28758]: I0223 14:49:18.113552 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9248a4c-32fb-402d-8277-a8ea33fe97fe" path="/var/lib/kubelet/pods/f9248a4c-32fb-402d-8277-a8ea33fe97fe/volumes" Feb 23 14:49:18.191345 master-0 kubenswrapper[28758]: I0223 14:49:18.189244 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\" (UID: \"462dd76d-fc73-4153-a592-d8d2fa38aaf1\") " Feb 23 14:49:18.220387 master-0 kubenswrapper[28758]: I0223 14:49:18.220294 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220387 master-0 kubenswrapper[28758]: I0223 14:49:18.220371 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220387 master-0 kubenswrapper[28758]: I0223 14:49:18.220391 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220387 master-0 kubenswrapper[28758]: I0223 14:49:18.220403 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220416 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-logs\") on node 
\"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220428 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220441 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdvpz\" (UniqueName: \"kubernetes.io/projected/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-kube-api-access-sdvpz\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220453 28758 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/462dd76d-fc73-4153-a592-d8d2fa38aaf1-httpd-run\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220465 28758 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220491 28758 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220503 28758 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-httpd-run\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220515 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szjtd\" (UniqueName: 
\"kubernetes.io/projected/462dd76d-fc73-4153-a592-d8d2fa38aaf1-kube-api-access-szjtd\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220526 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a697af-fabe-4d68-b98f-eb9d3ba688f1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.220820 master-0 kubenswrapper[28758]: I0223 14:49:18.220537 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/462dd76d-fc73-4153-a592-d8d2fa38aaf1-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.543143 master-0 kubenswrapper[28758]: I0223 14:49:18.543065 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-b22m4" Feb 23 14:49:18.634025 master-0 kubenswrapper[28758]: I0223 14:49:18.632644 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc685c7-da7b-4080-972d-1b4bd11f68ea-operator-scripts\") pod \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " Feb 23 14:49:18.634025 master-0 kubenswrapper[28758]: I0223 14:49:18.632728 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6p69\" (UniqueName: \"kubernetes.io/projected/dcc685c7-da7b-4080-972d-1b4bd11f68ea-kube-api-access-h6p69\") pod \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\" (UID: \"dcc685c7-da7b-4080-972d-1b4bd11f68ea\") " Feb 23 14:49:18.634025 master-0 kubenswrapper[28758]: I0223 14:49:18.632961 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc685c7-da7b-4080-972d-1b4bd11f68ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcc685c7-da7b-4080-972d-1b4bd11f68ea" (UID: 
"dcc685c7-da7b-4080-972d-1b4bd11f68ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:18.634025 master-0 kubenswrapper[28758]: I0223 14:49:18.633307 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc685c7-da7b-4080-972d-1b4bd11f68ea-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.638562 master-0 kubenswrapper[28758]: I0223 14:49:18.638438 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc685c7-da7b-4080-972d-1b4bd11f68ea-kube-api-access-h6p69" (OuterVolumeSpecName: "kube-api-access-h6p69") pod "dcc685c7-da7b-4080-972d-1b4bd11f68ea" (UID: "dcc685c7-da7b-4080-972d-1b4bd11f68ea"). InnerVolumeSpecName "kube-api-access-h6p69". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:18.736904 master-0 kubenswrapper[28758]: I0223 14:49:18.736339 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6p69\" (UniqueName: \"kubernetes.io/projected/dcc685c7-da7b-4080-972d-1b4bd11f68ea-kube-api-access-h6p69\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:18.917029 master-0 kubenswrapper[28758]: I0223 14:49:18.916905 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-b22m4" Feb 23 14:49:18.917029 master-0 kubenswrapper[28758]: I0223 14:49:18.916908 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-b22m4" event={"ID":"dcc685c7-da7b-4080-972d-1b4bd11f68ea","Type":"ContainerDied","Data":"2036a8e00b5b47a4b527e01cebb846d497aa315490e198430a6ac0b3a78b100e"} Feb 23 14:49:18.917029 master-0 kubenswrapper[28758]: I0223 14:49:18.916979 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2036a8e00b5b47a4b527e01cebb846d497aa315490e198430a6ac0b3a78b100e" Feb 23 14:49:18.920669 master-0 kubenswrapper[28758]: I0223 14:49:18.919353 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" event={"ID":"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef","Type":"ContainerStarted","Data":"7cb67f8d72e3e54bce465a80b556566400c5d647ff8f81e442763ed5e78f5ff4"} Feb 23 14:49:18.920669 master-0 kubenswrapper[28758]: I0223 14:49:18.919445 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:18.920669 master-0 kubenswrapper[28758]: I0223 14:49:18.920501 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:18.921988 master-0 kubenswrapper[28758]: I0223 14:49:18.920867 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:18.969187 master-0 kubenswrapper[28758]: I0223 14:49:18.963510 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" podStartSLOduration=4.963488492 podStartE2EDuration="4.963488492s" podCreationTimestamp="2026-02-23 14:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:18.954297328 +0000 UTC m=+891.080613260" watchObservedRunningTime="2026-02-23 14:49:18.963488492 +0000 UTC m=+891.089804414" Feb 23 14:49:19.024514 master-0 kubenswrapper[28758]: I0223 14:49:19.021074 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:19.042442 master-0 kubenswrapper[28758]: I0223 14:49:19.039789 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:19.066522 master-0 kubenswrapper[28758]: I0223 14:49:19.065924 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:19.066763 master-0 kubenswrapper[28758]: E0223 14:49:19.066666 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc685c7-da7b-4080-972d-1b4bd11f68ea" containerName="mariadb-database-create" Feb 23 14:49:19.066763 master-0 kubenswrapper[28758]: I0223 14:49:19.066688 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc685c7-da7b-4080-972d-1b4bd11f68ea" containerName="mariadb-database-create" Feb 23 14:49:19.066763 master-0 kubenswrapper[28758]: E0223 14:49:19.066715 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9248a4c-32fb-402d-8277-a8ea33fe97fe" containerName="init" Feb 23 14:49:19.066763 master-0 kubenswrapper[28758]: I0223 14:49:19.066724 28758 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9248a4c-32fb-402d-8277-a8ea33fe97fe" containerName="init" Feb 23 14:49:19.069764 master-0 kubenswrapper[28758]: I0223 14:49:19.066986 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc685c7-da7b-4080-972d-1b4bd11f68ea" containerName="mariadb-database-create" Feb 23 14:49:19.069764 master-0 kubenswrapper[28758]: I0223 14:49:19.067055 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9248a4c-32fb-402d-8277-a8ea33fe97fe" containerName="init" Feb 23 14:49:19.069764 master-0 kubenswrapper[28758]: I0223 14:49:19.068532 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.084513 master-0 kubenswrapper[28758]: I0223 14:49:19.079649 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 14:49:19.084513 master-0 kubenswrapper[28758]: I0223 14:49:19.080227 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-63e78-default-internal-config-data" Feb 23 14:49:19.100025 master-0 kubenswrapper[28758]: I0223 14:49:19.098258 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:19.269509 master-0 kubenswrapper[28758]: I0223 14:49:19.266830 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.269509 master-0 kubenswrapper[28758]: I0223 14:49:19.267606 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.269509 master-0 kubenswrapper[28758]: I0223 14:49:19.267784 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdx92\" (UniqueName: \"kubernetes.io/projected/76a97aa3-19a1-44d3-9019-da7b27957297-kube-api-access-vdx92\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.269509 master-0 kubenswrapper[28758]: I0223 14:49:19.267895 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.269509 master-0 kubenswrapper[28758]: I0223 14:49:19.267918 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.269509 master-0 kubenswrapper[28758]: I0223 14:49:19.267977 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.269509 master-0 kubenswrapper[28758]: I0223 
14:49:19.268165 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.371162 master-0 kubenswrapper[28758]: I0223 14:49:19.371095 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.371359 master-0 kubenswrapper[28758]: I0223 14:49:19.371338 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.371406 master-0 kubenswrapper[28758]: I0223 14:49:19.371370 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.371528 master-0 kubenswrapper[28758]: I0223 14:49:19.371494 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdx92\" (UniqueName: \"kubernetes.io/projected/76a97aa3-19a1-44d3-9019-da7b27957297-kube-api-access-vdx92\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " 
pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.371656 master-0 kubenswrapper[28758]: I0223 14:49:19.371589 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.371656 master-0 kubenswrapper[28758]: I0223 14:49:19.371613 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.372316 master-0 kubenswrapper[28758]: I0223 14:49:19.371715 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.372316 master-0 kubenswrapper[28758]: I0223 14:49:19.371926 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.372444 master-0 kubenswrapper[28758]: I0223 14:49:19.372362 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " 
pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.376898 master-0 kubenswrapper[28758]: I0223 14:49:19.376854 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.378023 master-0 kubenswrapper[28758]: I0223 14:49:19.377935 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.378023 master-0 kubenswrapper[28758]: I0223 14:49:19.377943 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.378831 master-0 kubenswrapper[28758]: I0223 14:49:19.378809 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.391950 master-0 kubenswrapper[28758]: I0223 14:49:19.391902 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdx92\" (UniqueName: \"kubernetes.io/projected/76a97aa3-19a1-44d3-9019-da7b27957297-kube-api-access-vdx92\") pod \"glance-63e78-default-internal-api-0\" (UID: 
\"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.569577 master-0 kubenswrapper[28758]: I0223 14:49:19.569506 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322" (OuterVolumeSpecName: "glance") pod "462dd76d-fc73-4153-a592-d8d2fa38aaf1" (UID: "462dd76d-fc73-4153-a592-d8d2fa38aaf1"). InnerVolumeSpecName "pvc-fee995dc-6f05-4147-9442-57dcc3df496b". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 14:49:19.580606 master-0 kubenswrapper[28758]: I0223 14:49:19.580560 28758 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") on node \"master-0\" " Feb 23 14:49:19.600158 master-0 kubenswrapper[28758]: I0223 14:49:19.599653 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.628520 master-0 kubenswrapper[28758]: I0223 14:49:19.627763 28758 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 14:49:19.628520 master-0 kubenswrapper[28758]: I0223 14:49:19.628028 28758 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fee995dc-6f05-4147-9442-57dcc3df496b" (UniqueName: "kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322") on node "master-0" Feb 23 14:49:19.681956 master-0 kubenswrapper[28758]: I0223 14:49:19.681905 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\" (UID: \"e5a697af-fabe-4d68-b98f-eb9d3ba688f1\") " Feb 23 14:49:19.682464 master-0 kubenswrapper[28758]: I0223 14:49:19.682442 28758 reconciler_common.go:293] "Volume detached for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:19.710773 master-0 kubenswrapper[28758]: I0223 14:49:19.707287 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28" (OuterVolumeSpecName: "glance") pod "e5a697af-fabe-4d68-b98f-eb9d3ba688f1" (UID: "e5a697af-fabe-4d68-b98f-eb9d3ba688f1"). InnerVolumeSpecName "pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 14:49:19.788358 master-0 kubenswrapper[28758]: I0223 14:49:19.788040 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:19.923769 master-0 kubenswrapper[28758]: I0223 14:49:19.923713 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:19.943056 master-0 kubenswrapper[28758]: I0223 14:49:19.942998 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:19.947947 master-0 kubenswrapper[28758]: I0223 14:49:19.947879 28758 generic.go:334] "Generic (PLEG): container finished" podID="48a556fc-7e30-4fd5-b266-6315dfdcb0e8" containerID="3a7a3ce5df768907856540049b9397f5cff2fd6bc39ac7546a2c808ab9445216" exitCode=0 Feb 23 14:49:19.948225 master-0 kubenswrapper[28758]: I0223 14:49:19.948162 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b85ps" event={"ID":"48a556fc-7e30-4fd5-b266-6315dfdcb0e8","Type":"ContainerDied","Data":"3a7a3ce5df768907856540049b9397f5cff2fd6bc39ac7546a2c808ab9445216"} Feb 23 14:49:19.960516 master-0 kubenswrapper[28758]: I0223 14:49:19.959321 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:19.961166 master-0 kubenswrapper[28758]: I0223 14:49:19.961132 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:19.965780 master-0 kubenswrapper[28758]: I0223 14:49:19.964556 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 14:49:19.966277 master-0 kubenswrapper[28758]: I0223 14:49:19.966029 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-63e78-default-external-config-data" Feb 23 14:49:19.985608 master-0 kubenswrapper[28758]: I0223 14:49:19.985546 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:20.104885 master-0 kubenswrapper[28758]: I0223 14:49:20.101940 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.104885 master-0 kubenswrapper[28758]: I0223 14:49:20.104630 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jngbf\" (UniqueName: \"kubernetes.io/projected/a09b843c-6e74-4532-8ce4-26147e97d8c5-kube-api-access-jngbf\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.104885 master-0 kubenswrapper[28758]: I0223 14:49:20.104855 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.105177 master-0 kubenswrapper[28758]: I0223 
14:49:20.104955 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.105177 master-0 kubenswrapper[28758]: I0223 14:49:20.105019 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.105468 master-0 kubenswrapper[28758]: I0223 14:49:20.105442 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.105563 master-0 kubenswrapper[28758]: I0223 14:49:20.105504 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.106003 master-0 kubenswrapper[28758]: I0223 14:49:20.105960 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-public-tls-certs\") pod 
\"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.108722 master-0 kubenswrapper[28758]: I0223 14:49:20.108692 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462dd76d-fc73-4153-a592-d8d2fa38aaf1" path="/var/lib/kubelet/pods/462dd76d-fc73-4153-a592-d8d2fa38aaf1/volumes" Feb 23 14:49:20.109416 master-0 kubenswrapper[28758]: I0223 14:49:20.109396 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a697af-fabe-4d68-b98f-eb9d3ba688f1" path="/var/lib/kubelet/pods/e5a697af-fabe-4d68-b98f-eb9d3ba688f1/volumes" Feb 23 14:49:20.208072 master-0 kubenswrapper[28758]: I0223 14:49:20.208007 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.208295 master-0 kubenswrapper[28758]: I0223 14:49:20.208278 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.208336 master-0 kubenswrapper[28758]: I0223 14:49:20.208307 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.208815 master-0 kubenswrapper[28758]: I0223 14:49:20.208669 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.208971 master-0 kubenswrapper[28758]: I0223 14:49:20.208946 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-public-tls-certs\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.209060 master-0 kubenswrapper[28758]: I0223 14:49:20.209040 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.209112 master-0 kubenswrapper[28758]: I0223 14:49:20.209093 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jngbf\" (UniqueName: \"kubernetes.io/projected/a09b843c-6e74-4532-8ce4-26147e97d8c5-kube-api-access-jngbf\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.209208 master-0 kubenswrapper[28758]: I0223 14:49:20.209189 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.209269 master-0 
kubenswrapper[28758]: I0223 14:49:20.209250 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.209694 master-0 kubenswrapper[28758]: I0223 14:49:20.209657 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.211267 master-0 kubenswrapper[28758]: I0223 14:49:20.211238 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 14:49:20.211381 master-0 kubenswrapper[28758]: I0223 14:49:20.211272 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/e3fa539436bb43727cdf11c2c03cf22a4b969059756e5f07ca659a4a6862fdb6/globalmount\"" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.212355 master-0 kubenswrapper[28758]: I0223 14:49:20.212311 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 
23 14:49:20.214048 master-0 kubenswrapper[28758]: I0223 14:49:20.214009 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.215511 master-0 kubenswrapper[28758]: I0223 14:49:20.215393 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.216534 master-0 kubenswrapper[28758]: I0223 14:49:20.216384 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-public-tls-certs\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:20.247498 master-0 kubenswrapper[28758]: I0223 14:49:20.247424 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jngbf\" (UniqueName: \"kubernetes.io/projected/a09b843c-6e74-4532-8ce4-26147e97d8c5-kube-api-access-jngbf\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:21.115669 master-0 kubenswrapper[28758]: I0223 14:49:21.115601 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-279b-account-create-update-vbsbw" Feb 23 14:49:21.139073 master-0 kubenswrapper[28758]: I0223 14:49:21.139014 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24xqs\" (UniqueName: \"kubernetes.io/projected/651d3a04-116a-4337-8f42-3865d8a0b9be-kube-api-access-24xqs\") pod \"651d3a04-116a-4337-8f42-3865d8a0b9be\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " Feb 23 14:49:21.139474 master-0 kubenswrapper[28758]: I0223 14:49:21.139438 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651d3a04-116a-4337-8f42-3865d8a0b9be-operator-scripts\") pod \"651d3a04-116a-4337-8f42-3865d8a0b9be\" (UID: \"651d3a04-116a-4337-8f42-3865d8a0b9be\") " Feb 23 14:49:21.140168 master-0 kubenswrapper[28758]: I0223 14:49:21.140085 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651d3a04-116a-4337-8f42-3865d8a0b9be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "651d3a04-116a-4337-8f42-3865d8a0b9be" (UID: "651d3a04-116a-4337-8f42-3865d8a0b9be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:21.141325 master-0 kubenswrapper[28758]: I0223 14:49:21.141279 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/651d3a04-116a-4337-8f42-3865d8a0b9be-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:21.142590 master-0 kubenswrapper[28758]: I0223 14:49:21.142532 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651d3a04-116a-4337-8f42-3865d8a0b9be-kube-api-access-24xqs" (OuterVolumeSpecName: "kube-api-access-24xqs") pod "651d3a04-116a-4337-8f42-3865d8a0b9be" (UID: "651d3a04-116a-4337-8f42-3865d8a0b9be"). 
InnerVolumeSpecName "kube-api-access-24xqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:21.168423 master-0 kubenswrapper[28758]: I0223 14:49:21.168358 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:21.243690 master-0 kubenswrapper[28758]: I0223 14:49:21.243538 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24xqs\" (UniqueName: \"kubernetes.io/projected/651d3a04-116a-4337-8f42-3865d8a0b9be-kube-api-access-24xqs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:21.501581 master-0 kubenswrapper[28758]: I0223 14:49:21.501439 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:21.987999 master-0 kubenswrapper[28758]: I0223 14:49:21.987918 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b85ps" event={"ID":"48a556fc-7e30-4fd5-b266-6315dfdcb0e8","Type":"ContainerDied","Data":"d992c9530995798c0c16c16975b8ff09dfaa1b158be2eda87b3ee31b953a8fb0"} Feb 23 14:49:21.987999 master-0 kubenswrapper[28758]: I0223 14:49:21.987980 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d992c9530995798c0c16c16975b8ff09dfaa1b158be2eda87b3ee31b953a8fb0" Feb 23 14:49:21.995567 master-0 kubenswrapper[28758]: I0223 14:49:21.993402 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-279b-account-create-update-vbsbw" event={"ID":"651d3a04-116a-4337-8f42-3865d8a0b9be","Type":"ContainerDied","Data":"302908afd78ed9546d99d84ae19db1fc147ed9739a995ae1f1bb4ada14a330f8"} Feb 23 14:49:21.995567 master-0 kubenswrapper[28758]: I0223 14:49:21.993448 
28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302908afd78ed9546d99d84ae19db1fc147ed9739a995ae1f1bb4ada14a330f8" Feb 23 14:49:21.995567 master-0 kubenswrapper[28758]: I0223 14:49:21.993561 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-279b-account-create-update-vbsbw" Feb 23 14:49:22.261895 master-0 kubenswrapper[28758]: I0223 14:49:22.261787 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b85ps" Feb 23 14:49:22.386977 master-0 kubenswrapper[28758]: I0223 14:49:22.386917 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w8gb\" (UniqueName: \"kubernetes.io/projected/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-kube-api-access-6w8gb\") pod \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " Feb 23 14:49:22.387113 master-0 kubenswrapper[28758]: I0223 14:49:22.386984 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-credential-keys\") pod \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " Feb 23 14:49:22.387565 master-0 kubenswrapper[28758]: I0223 14:49:22.387531 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-config-data\") pod \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " Feb 23 14:49:22.387624 master-0 kubenswrapper[28758]: I0223 14:49:22.387595 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-fernet-keys\") pod \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\" (UID: 
\"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " Feb 23 14:49:22.387624 master-0 kubenswrapper[28758]: I0223 14:49:22.387617 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-combined-ca-bundle\") pod \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " Feb 23 14:49:22.387686 master-0 kubenswrapper[28758]: I0223 14:49:22.387644 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-scripts\") pod \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\" (UID: \"48a556fc-7e30-4fd5-b266-6315dfdcb0e8\") " Feb 23 14:49:22.390738 master-0 kubenswrapper[28758]: I0223 14:49:22.390690 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-kube-api-access-6w8gb" (OuterVolumeSpecName: "kube-api-access-6w8gb") pod "48a556fc-7e30-4fd5-b266-6315dfdcb0e8" (UID: "48a556fc-7e30-4fd5-b266-6315dfdcb0e8"). InnerVolumeSpecName "kube-api-access-6w8gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:22.392734 master-0 kubenswrapper[28758]: I0223 14:49:22.391907 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "48a556fc-7e30-4fd5-b266-6315dfdcb0e8" (UID: "48a556fc-7e30-4fd5-b266-6315dfdcb0e8"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:22.392734 master-0 kubenswrapper[28758]: I0223 14:49:22.391943 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "48a556fc-7e30-4fd5-b266-6315dfdcb0e8" (UID: "48a556fc-7e30-4fd5-b266-6315dfdcb0e8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:22.395697 master-0 kubenswrapper[28758]: I0223 14:49:22.393737 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-scripts" (OuterVolumeSpecName: "scripts") pod "48a556fc-7e30-4fd5-b266-6315dfdcb0e8" (UID: "48a556fc-7e30-4fd5-b266-6315dfdcb0e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:22.420148 master-0 kubenswrapper[28758]: I0223 14:49:22.419573 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-config-data" (OuterVolumeSpecName: "config-data") pod "48a556fc-7e30-4fd5-b266-6315dfdcb0e8" (UID: "48a556fc-7e30-4fd5-b266-6315dfdcb0e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:22.420148 master-0 kubenswrapper[28758]: I0223 14:49:22.419781 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48a556fc-7e30-4fd5-b266-6315dfdcb0e8" (UID: "48a556fc-7e30-4fd5-b266-6315dfdcb0e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:22.483896 master-0 kubenswrapper[28758]: I0223 14:49:22.483853 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:49:22.491112 master-0 kubenswrapper[28758]: I0223 14:49:22.491074 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w8gb\" (UniqueName: \"kubernetes.io/projected/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-kube-api-access-6w8gb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:22.491261 master-0 kubenswrapper[28758]: I0223 14:49:22.491246 28758 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-credential-keys\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:22.491341 master-0 kubenswrapper[28758]: I0223 14:49:22.491328 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:22.491425 master-0 kubenswrapper[28758]: I0223 14:49:22.491411 28758 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-fernet-keys\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:22.491524 master-0 kubenswrapper[28758]: I0223 14:49:22.491511 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:22.491615 master-0 kubenswrapper[28758]: I0223 14:49:22.491601 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48a556fc-7e30-4fd5-b266-6315dfdcb0e8-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:22.621665 master-0 kubenswrapper[28758]: 
I0223 14:49:22.621608 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:22.677651 master-0 kubenswrapper[28758]: I0223 14:49:22.677525 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:23.007384 master-0 kubenswrapper[28758]: I0223 14:49:23.007311 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"76a97aa3-19a1-44d3-9019-da7b27957297","Type":"ContainerStarted","Data":"8fc8c6cfd6b7d27127599caf0fa236d17dda5219c208ccbc690fc2e0a894b165"} Feb 23 14:49:23.009792 master-0 kubenswrapper[28758]: I0223 14:49:23.009755 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b85ps" Feb 23 14:49:23.009792 master-0 kubenswrapper[28758]: I0223 14:49:23.009788 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9z2m" event={"ID":"0344160e-18f1-422b-90f8-663a15320959","Type":"ContainerStarted","Data":"14390245c2be873fca1fe1fb4e69d4bc37faf2c1b5c8442ab2e12b381d91c0b7"} Feb 23 14:49:23.132071 master-0 kubenswrapper[28758]: I0223 14:49:23.131993 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-w9z2m" podStartSLOduration=3.326329544 podStartE2EDuration="9.131976715s" podCreationTimestamp="2026-02-23 14:49:14 +0000 UTC" firstStartedPulling="2026-02-23 14:49:16.162810741 +0000 UTC m=+888.289126673" lastFinishedPulling="2026-02-23 14:49:21.968457912 +0000 UTC m=+894.094773844" observedRunningTime="2026-02-23 14:49:23.108849861 +0000 UTC m=+895.235165793" watchObservedRunningTime="2026-02-23 14:49:23.131976715 +0000 UTC m=+895.258292647" Feb 23 14:49:23.258905 master-0 kubenswrapper[28758]: I0223 14:49:23.258257 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:49:23.261608 master-0 kubenswrapper[28758]: W0223 14:49:23.261559 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda09b843c_6e74_4532_8ce4_26147e97d8c5.slice/crio-56f7b6afc53ef30e889efc2e9c46ee89e3a7677a006d35de42f8733392b7a62a WatchSource:0}: Error finding container 56f7b6afc53ef30e889efc2e9c46ee89e3a7677a006d35de42f8733392b7a62a: Status 404 returned error can't find the container with id 56f7b6afc53ef30e889efc2e9c46ee89e3a7677a006d35de42f8733392b7a62a Feb 23 14:49:23.379817 master-0 kubenswrapper[28758]: I0223 14:49:23.379666 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b85ps"] Feb 23 14:49:23.392734 master-0 kubenswrapper[28758]: I0223 
14:49:23.392671 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b85ps"] Feb 23 14:49:23.479071 master-0 kubenswrapper[28758]: I0223 14:49:23.479008 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pn6b2"] Feb 23 14:49:23.479643 master-0 kubenswrapper[28758]: E0223 14:49:23.479615 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651d3a04-116a-4337-8f42-3865d8a0b9be" containerName="mariadb-account-create-update" Feb 23 14:49:23.479643 master-0 kubenswrapper[28758]: I0223 14:49:23.479637 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="651d3a04-116a-4337-8f42-3865d8a0b9be" containerName="mariadb-account-create-update" Feb 23 14:49:23.479736 master-0 kubenswrapper[28758]: E0223 14:49:23.479648 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a556fc-7e30-4fd5-b266-6315dfdcb0e8" containerName="keystone-bootstrap" Feb 23 14:49:23.479736 master-0 kubenswrapper[28758]: I0223 14:49:23.479655 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a556fc-7e30-4fd5-b266-6315dfdcb0e8" containerName="keystone-bootstrap" Feb 23 14:49:23.479920 master-0 kubenswrapper[28758]: I0223 14:49:23.479891 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="651d3a04-116a-4337-8f42-3865d8a0b9be" containerName="mariadb-account-create-update" Feb 23 14:49:23.479920 master-0 kubenswrapper[28758]: I0223 14:49:23.479911 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a556fc-7e30-4fd5-b266-6315dfdcb0e8" containerName="keystone-bootstrap" Feb 23 14:49:23.480703 master-0 kubenswrapper[28758]: I0223 14:49:23.480674 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.483342 master-0 kubenswrapper[28758]: I0223 14:49:23.483313 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 14:49:23.483418 master-0 kubenswrapper[28758]: I0223 14:49:23.483391 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 14:49:23.483546 master-0 kubenswrapper[28758]: I0223 14:49:23.483506 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 14:49:23.483629 master-0 kubenswrapper[28758]: I0223 14:49:23.483558 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 14:49:23.492589 master-0 kubenswrapper[28758]: I0223 14:49:23.489511 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pn6b2"] Feb 23 14:49:23.522509 master-0 kubenswrapper[28758]: I0223 14:49:23.522422 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-fernet-keys\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.522740 master-0 kubenswrapper[28758]: I0223 14:49:23.522559 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-credential-keys\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.522740 master-0 kubenswrapper[28758]: I0223 14:49:23.522644 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-scripts\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.522740 master-0 kubenswrapper[28758]: I0223 14:49:23.522684 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.522740 master-0 kubenswrapper[28758]: I0223 14:49:23.522704 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-combined-ca-bundle\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.523097 master-0 kubenswrapper[28758]: I0223 14:49:23.523060 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbcz2\" (UniqueName: \"kubernetes.io/projected/2a94e87c-4854-4511-86a5-bb59bd265598-kube-api-access-lbcz2\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.630446 master-0 kubenswrapper[28758]: I0223 14:49:23.630315 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbcz2\" (UniqueName: \"kubernetes.io/projected/2a94e87c-4854-4511-86a5-bb59bd265598-kube-api-access-lbcz2\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.630446 master-0 kubenswrapper[28758]: I0223 14:49:23.630444 28758 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-fernet-keys\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.630655 master-0 kubenswrapper[28758]: I0223 14:49:23.630526 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-credential-keys\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.630655 master-0 kubenswrapper[28758]: I0223 14:49:23.630620 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.630734 master-0 kubenswrapper[28758]: I0223 14:49:23.630658 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-scripts\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.630734 master-0 kubenswrapper[28758]: I0223 14:49:23.630683 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-combined-ca-bundle\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.634839 master-0 kubenswrapper[28758]: I0223 14:49:23.634798 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-scripts\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.634839 master-0 kubenswrapper[28758]: I0223 14:49:23.634828 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-fernet-keys\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.635054 master-0 kubenswrapper[28758]: I0223 14:49:23.634924 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.636089 master-0 kubenswrapper[28758]: I0223 14:49:23.636056 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-credential-keys\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.638082 master-0 kubenswrapper[28758]: I0223 14:49:23.638013 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-combined-ca-bundle\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.653451 master-0 kubenswrapper[28758]: I0223 14:49:23.652606 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbcz2\" (UniqueName: 
\"kubernetes.io/projected/2a94e87c-4854-4511-86a5-bb59bd265598-kube-api-access-lbcz2\") pod \"keystone-bootstrap-pn6b2\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:23.812319 master-0 kubenswrapper[28758]: I0223 14:49:23.812267 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:24.020984 master-0 kubenswrapper[28758]: I0223 14:49:24.020891 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"a09b843c-6e74-4532-8ce4-26147e97d8c5","Type":"ContainerStarted","Data":"cca996d036b0d938c26b7289bdc7d311eca327d7be81f6a3839239c667eb9ed6"} Feb 23 14:49:24.020984 master-0 kubenswrapper[28758]: I0223 14:49:24.020984 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"a09b843c-6e74-4532-8ce4-26147e97d8c5","Type":"ContainerStarted","Data":"56f7b6afc53ef30e889efc2e9c46ee89e3a7677a006d35de42f8733392b7a62a"} Feb 23 14:49:24.022730 master-0 kubenswrapper[28758]: I0223 14:49:24.022693 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"76a97aa3-19a1-44d3-9019-da7b27957297","Type":"ContainerStarted","Data":"1b34d62d94b3cfc0f7ea720c20214c859b4f1d29c479e024171cc5193e14f80b"} Feb 23 14:49:24.022730 master-0 kubenswrapper[28758]: I0223 14:49:24.022728 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"76a97aa3-19a1-44d3-9019-da7b27957297","Type":"ContainerStarted","Data":"eabfb46572be76a89d392cc4f8273ef2d73faca6dfeac81f8567fbb5e1856924"} Feb 23 14:49:24.059553 master-0 kubenswrapper[28758]: I0223 14:49:24.055505 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-63e78-default-internal-api-0" podStartSLOduration=5.055455102 
podStartE2EDuration="5.055455102s" podCreationTimestamp="2026-02-23 14:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:24.050299675 +0000 UTC m=+896.176615617" watchObservedRunningTime="2026-02-23 14:49:24.055455102 +0000 UTC m=+896.181771034" Feb 23 14:49:24.118508 master-0 kubenswrapper[28758]: I0223 14:49:24.118391 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a556fc-7e30-4fd5-b266-6315dfdcb0e8" path="/var/lib/kubelet/pods/48a556fc-7e30-4fd5-b266-6315dfdcb0e8/volumes" Feb 23 14:49:24.673724 master-0 kubenswrapper[28758]: I0223 14:49:24.673639 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-szljh"] Feb 23 14:49:24.676619 master-0 kubenswrapper[28758]: I0223 14:49:24.676570 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.679467 master-0 kubenswrapper[28758]: I0223 14:49:24.679274 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts" Feb 23 14:49:24.679725 master-0 kubenswrapper[28758]: I0223 14:49:24.679680 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Feb 23 14:49:24.688898 master-0 kubenswrapper[28758]: I0223 14:49:24.688841 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-szljh"] Feb 23 14:49:24.868680 master-0 kubenswrapper[28758]: I0223 14:49:24.868619 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.869026 master-0 kubenswrapper[28758]: I0223 14:49:24.869002 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-combined-ca-bundle\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.869180 master-0 kubenswrapper[28758]: I0223 14:49:24.869162 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc21cc06-410c-4afe-85d4-a72d8cebf881-etc-podinfo\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.869329 master-0 kubenswrapper[28758]: I0223 14:49:24.869308 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-scripts\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.869468 master-0 kubenswrapper[28758]: I0223 14:49:24.869448 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data-merged\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.869609 master-0 kubenswrapper[28758]: I0223 14:49:24.869588 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxp6\" (UniqueName: \"kubernetes.io/projected/cc21cc06-410c-4afe-85d4-a72d8cebf881-kube-api-access-pbxp6\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.971728 master-0 
kubenswrapper[28758]: I0223 14:49:24.971625 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.972884 master-0 kubenswrapper[28758]: I0223 14:49:24.972853 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-combined-ca-bundle\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.973441 master-0 kubenswrapper[28758]: I0223 14:49:24.973422 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc21cc06-410c-4afe-85d4-a72d8cebf881-etc-podinfo\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.973621 master-0 kubenswrapper[28758]: I0223 14:49:24.973598 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-scripts\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.973756 master-0 kubenswrapper[28758]: I0223 14:49:24.973737 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data-merged\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.973865 master-0 kubenswrapper[28758]: I0223 14:49:24.973847 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pbxp6\" (UniqueName: \"kubernetes.io/projected/cc21cc06-410c-4afe-85d4-a72d8cebf881-kube-api-access-pbxp6\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.979700 master-0 kubenswrapper[28758]: I0223 14:49:24.976894 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data-merged\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.995022 master-0 kubenswrapper[28758]: I0223 14:49:24.994170 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-combined-ca-bundle\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.995022 master-0 kubenswrapper[28758]: I0223 14:49:24.994283 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:24.995022 master-0 kubenswrapper[28758]: I0223 14:49:24.994562 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-scripts\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:25.008297 master-0 kubenswrapper[28758]: I0223 14:49:25.008252 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxp6\" (UniqueName: 
\"kubernetes.io/projected/cc21cc06-410c-4afe-85d4-a72d8cebf881-kube-api-access-pbxp6\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:25.016253 master-0 kubenswrapper[28758]: I0223 14:49:25.016206 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc21cc06-410c-4afe-85d4-a72d8cebf881-etc-podinfo\") pod \"ironic-db-sync-szljh\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:25.020971 master-0 kubenswrapper[28758]: I0223 14:49:25.020921 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:49:25.090419 master-0 kubenswrapper[28758]: I0223 14:49:25.089247 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9b67b7fc-m76r8"] Feb 23 14:49:25.090419 master-0 kubenswrapper[28758]: I0223 14:49:25.089623 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="dnsmasq-dns" containerID="cri-o://b053e4c2eb401e2d1292a5655add1836c256f85d8c7cb49382994cbed906e288" gracePeriod=10 Feb 23 14:49:25.312859 master-0 kubenswrapper[28758]: I0223 14:49:25.312748 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:26.052342 master-0 kubenswrapper[28758]: I0223 14:49:26.052148 28758 generic.go:334] "Generic (PLEG): container finished" podID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerID="b053e4c2eb401e2d1292a5655add1836c256f85d8c7cb49382994cbed906e288" exitCode=0 Feb 23 14:49:26.052342 master-0 kubenswrapper[28758]: I0223 14:49:26.052225 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" event={"ID":"67d05e7e-cde1-4bc2-93b0-62274ff2002a","Type":"ContainerDied","Data":"b053e4c2eb401e2d1292a5655add1836c256f85d8c7cb49382994cbed906e288"} Feb 23 14:49:26.053547 master-0 kubenswrapper[28758]: I0223 14:49:26.053527 28758 generic.go:334] "Generic (PLEG): container finished" podID="0344160e-18f1-422b-90f8-663a15320959" containerID="14390245c2be873fca1fe1fb4e69d4bc37faf2c1b5c8442ab2e12b381d91c0b7" exitCode=0 Feb 23 14:49:26.053547 master-0 kubenswrapper[28758]: I0223 14:49:26.053546 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9z2m" event={"ID":"0344160e-18f1-422b-90f8-663a15320959","Type":"ContainerDied","Data":"14390245c2be873fca1fe1fb4e69d4bc37faf2c1b5c8442ab2e12b381d91c0b7"} Feb 23 14:49:26.096876 master-0 kubenswrapper[28758]: I0223 14:49:26.096813 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.198:5353: connect: connection refused" Feb 23 14:49:31.096867 master-0 kubenswrapper[28758]: I0223 14:49:31.096778 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.198:5353: connect: connection refused" Feb 23 14:49:31.505003 master-0 kubenswrapper[28758]: I0223 
14:49:31.504943 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:31.505154 master-0 kubenswrapper[28758]: I0223 14:49:31.505012 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:31.531743 master-0 kubenswrapper[28758]: I0223 14:49:31.531690 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:31.549599 master-0 kubenswrapper[28758]: I0223 14:49:31.549558 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:31.561540 master-0 kubenswrapper[28758]: I0223 14:49:31.561507 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:31.987446 master-0 kubenswrapper[28758]: I0223 14:49:31.987361 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-combined-ca-bundle\") pod \"0344160e-18f1-422b-90f8-663a15320959\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " Feb 23 14:49:31.989504 master-0 kubenswrapper[28758]: I0223 14:49:31.988079 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-config-data\") pod \"0344160e-18f1-422b-90f8-663a15320959\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " Feb 23 14:49:31.989824 master-0 kubenswrapper[28758]: I0223 14:49:31.989800 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0344160e-18f1-422b-90f8-663a15320959-logs\") pod \"0344160e-18f1-422b-90f8-663a15320959\" (UID: 
\"0344160e-18f1-422b-90f8-663a15320959\") " Feb 23 14:49:31.989964 master-0 kubenswrapper[28758]: I0223 14:49:31.989952 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-scripts\") pod \"0344160e-18f1-422b-90f8-663a15320959\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " Feb 23 14:49:31.990153 master-0 kubenswrapper[28758]: I0223 14:49:31.990138 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49m5\" (UniqueName: \"kubernetes.io/projected/0344160e-18f1-422b-90f8-663a15320959-kube-api-access-k49m5\") pod \"0344160e-18f1-422b-90f8-663a15320959\" (UID: \"0344160e-18f1-422b-90f8-663a15320959\") " Feb 23 14:49:31.990983 master-0 kubenswrapper[28758]: I0223 14:49:31.990130 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0344160e-18f1-422b-90f8-663a15320959-logs" (OuterVolumeSpecName: "logs") pod "0344160e-18f1-422b-90f8-663a15320959" (UID: "0344160e-18f1-422b-90f8-663a15320959"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:49:31.992044 master-0 kubenswrapper[28758]: I0223 14:49:31.992013 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0344160e-18f1-422b-90f8-663a15320959-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:31.995969 master-0 kubenswrapper[28758]: I0223 14:49:31.995890 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0344160e-18f1-422b-90f8-663a15320959-kube-api-access-k49m5" (OuterVolumeSpecName: "kube-api-access-k49m5") pod "0344160e-18f1-422b-90f8-663a15320959" (UID: "0344160e-18f1-422b-90f8-663a15320959"). InnerVolumeSpecName "kube-api-access-k49m5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:32.004174 master-0 kubenswrapper[28758]: I0223 14:49:32.004089 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-scripts" (OuterVolumeSpecName: "scripts") pod "0344160e-18f1-422b-90f8-663a15320959" (UID: "0344160e-18f1-422b-90f8-663a15320959"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:32.018840 master-0 kubenswrapper[28758]: I0223 14:49:32.018535 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0344160e-18f1-422b-90f8-663a15320959" (UID: "0344160e-18f1-422b-90f8-663a15320959"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:32.060818 master-0 kubenswrapper[28758]: I0223 14:49:32.060724 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-config-data" (OuterVolumeSpecName: "config-data") pod "0344160e-18f1-422b-90f8-663a15320959" (UID: "0344160e-18f1-422b-90f8-663a15320959"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:32.096365 master-0 kubenswrapper[28758]: I0223 14:49:32.096297 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:32.096591 master-0 kubenswrapper[28758]: I0223 14:49:32.096379 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49m5\" (UniqueName: \"kubernetes.io/projected/0344160e-18f1-422b-90f8-663a15320959-kube-api-access-k49m5\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:32.096591 master-0 kubenswrapper[28758]: I0223 14:49:32.096450 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:32.096875 master-0 kubenswrapper[28758]: I0223 14:49:32.096852 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0344160e-18f1-422b-90f8-663a15320959-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:32.141972 master-0 kubenswrapper[28758]: I0223 14:49:32.139311 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-w9z2m" event={"ID":"0344160e-18f1-422b-90f8-663a15320959","Type":"ContainerDied","Data":"c5025665d0d8448131710b08518c038259de72f4a3e35815a1fef70febd6dade"} Feb 23 14:49:32.141972 master-0 kubenswrapper[28758]: I0223 14:49:32.139419 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5025665d0d8448131710b08518c038259de72f4a3e35815a1fef70febd6dade" Feb 23 14:49:32.141972 master-0 kubenswrapper[28758]: I0223 14:49:32.139456 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:32.141972 master-0 kubenswrapper[28758]: 
I0223 14:49:32.139590 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-w9z2m" Feb 23 14:49:32.141972 master-0 kubenswrapper[28758]: I0223 14:49:32.140935 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:33.103505 master-0 kubenswrapper[28758]: I0223 14:49:33.098844 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:49:33.226197 master-0 kubenswrapper[28758]: I0223 14:49:33.226095 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-config\") pod \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " Feb 23 14:49:33.226664 master-0 kubenswrapper[28758]: I0223 14:49:33.226246 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-svc\") pod \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " Feb 23 14:49:33.226664 master-0 kubenswrapper[28758]: I0223 14:49:33.226293 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-sb\") pod \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " Feb 23 14:49:33.226664 master-0 kubenswrapper[28758]: I0223 14:49:33.226316 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-swift-storage-0\") pod \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " 
Feb 23 14:49:33.226664 master-0 kubenswrapper[28758]: I0223 14:49:33.226407 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-nb\") pod \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " Feb 23 14:49:33.226664 master-0 kubenswrapper[28758]: I0223 14:49:33.226427 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpv6s\" (UniqueName: \"kubernetes.io/projected/67d05e7e-cde1-4bc2-93b0-62274ff2002a-kube-api-access-cpv6s\") pod \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\" (UID: \"67d05e7e-cde1-4bc2-93b0-62274ff2002a\") " Feb 23 14:49:33.238362 master-0 kubenswrapper[28758]: I0223 14:49:33.238295 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d05e7e-cde1-4bc2-93b0-62274ff2002a-kube-api-access-cpv6s" (OuterVolumeSpecName: "kube-api-access-cpv6s") pod "67d05e7e-cde1-4bc2-93b0-62274ff2002a" (UID: "67d05e7e-cde1-4bc2-93b0-62274ff2002a"). InnerVolumeSpecName "kube-api-access-cpv6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.247746 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6bdcbb4f68-p649t"] Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: E0223 14:49:33.248219 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="dnsmasq-dns" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.248232 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="dnsmasq-dns" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: E0223 14:49:33.248263 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="init" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.248271 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="init" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: E0223 14:49:33.248282 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0344160e-18f1-422b-90f8-663a15320959" containerName="placement-db-sync" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.248289 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0344160e-18f1-422b-90f8-663a15320959" containerName="placement-db-sync" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.248528 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" containerName="dnsmasq-dns" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.248568 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0344160e-18f1-422b-90f8-663a15320959" containerName="placement-db-sync" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.249706 28758 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.252768 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.252925 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.252948 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 23 14:49:33.257603 master-0 kubenswrapper[28758]: I0223 14:49:33.253063 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 23 14:49:33.260758 master-0 kubenswrapper[28758]: I0223 14:49:33.260700 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bdcbb4f68-p649t"] Feb 23 14:49:33.273843 master-0 kubenswrapper[28758]: I0223 14:49:33.273635 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" event={"ID":"67d05e7e-cde1-4bc2-93b0-62274ff2002a","Type":"ContainerDied","Data":"1446848d2286e951418f294d54a20e97e14e53decb5f633d1ea61354ac6ca13b"} Feb 23 14:49:33.273843 master-0 kubenswrapper[28758]: I0223 14:49:33.273728 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d9b67b7fc-m76r8" Feb 23 14:49:33.273843 master-0 kubenswrapper[28758]: I0223 14:49:33.273784 28758 scope.go:117] "RemoveContainer" containerID="b053e4c2eb401e2d1292a5655add1836c256f85d8c7cb49382994cbed906e288" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329372 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de7b549-c3f5-4105-8d7b-b0de62f9784e-logs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329457 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-internal-tls-certs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329560 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2b6d\" (UniqueName: \"kubernetes.io/projected/2de7b549-c3f5-4105-8d7b-b0de62f9784e-kube-api-access-r2b6d\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329594 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-config-data\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329664 
28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-public-tls-certs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329757 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-combined-ca-bundle\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329815 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-scripts\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.330860 master-0 kubenswrapper[28758]: I0223 14:49:33.329943 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpv6s\" (UniqueName: \"kubernetes.io/projected/67d05e7e-cde1-4bc2-93b0-62274ff2002a-kube-api-access-cpv6s\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:33.345405 master-0 kubenswrapper[28758]: I0223 14:49:33.345227 28758 scope.go:117] "RemoveContainer" containerID="a2cf7ff261dd77119bc5d044f8abbdd8055a635ee7cf0e8845d31b5cff026b5d" Feb 23 14:49:33.347856 master-0 kubenswrapper[28758]: I0223 14:49:33.347807 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67d05e7e-cde1-4bc2-93b0-62274ff2002a" (UID: 
"67d05e7e-cde1-4bc2-93b0-62274ff2002a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:33.365145 master-0 kubenswrapper[28758]: I0223 14:49:33.365029 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67d05e7e-cde1-4bc2-93b0-62274ff2002a" (UID: "67d05e7e-cde1-4bc2-93b0-62274ff2002a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:33.380002 master-0 kubenswrapper[28758]: I0223 14:49:33.379947 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "67d05e7e-cde1-4bc2-93b0-62274ff2002a" (UID: "67d05e7e-cde1-4bc2-93b0-62274ff2002a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:33.419690 master-0 kubenswrapper[28758]: I0223 14:49:33.419639 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-config" (OuterVolumeSpecName: "config") pod "67d05e7e-cde1-4bc2-93b0-62274ff2002a" (UID: "67d05e7e-cde1-4bc2-93b0-62274ff2002a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:33.431635 master-0 kubenswrapper[28758]: I0223 14:49:33.431578 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-scripts\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.432081 master-0 kubenswrapper[28758]: I0223 14:49:33.432051 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de7b549-c3f5-4105-8d7b-b0de62f9784e-logs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.432953 master-0 kubenswrapper[28758]: I0223 14:49:33.432776 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de7b549-c3f5-4105-8d7b-b0de62f9784e-logs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.433152 master-0 kubenswrapper[28758]: I0223 14:49:33.433126 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-internal-tls-certs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.433354 master-0 kubenswrapper[28758]: I0223 14:49:33.433337 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2b6d\" (UniqueName: \"kubernetes.io/projected/2de7b549-c3f5-4105-8d7b-b0de62f9784e-kube-api-access-r2b6d\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " 
pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.433523 master-0 kubenswrapper[28758]: I0223 14:49:33.433471 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-config-data\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.433753 master-0 kubenswrapper[28758]: I0223 14:49:33.433737 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-public-tls-certs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.434004 master-0 kubenswrapper[28758]: I0223 14:49:33.433989 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-combined-ca-bundle\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.434254 master-0 kubenswrapper[28758]: I0223 14:49:33.434239 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:33.434617 master-0 kubenswrapper[28758]: I0223 14:49:33.434596 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:33.434764 master-0 kubenswrapper[28758]: I0223 14:49:33.434750 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:33.434858 master-0 kubenswrapper[28758]: I0223 14:49:33.434848 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:33.435327 master-0 kubenswrapper[28758]: I0223 14:49:33.435033 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-scripts\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.437412 master-0 kubenswrapper[28758]: I0223 14:49:33.436701 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-internal-tls-certs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.437412 master-0 kubenswrapper[28758]: I0223 14:49:33.437085 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-public-tls-certs\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.437412 master-0 kubenswrapper[28758]: I0223 14:49:33.437368 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-config-data\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.438583 master-0 
kubenswrapper[28758]: I0223 14:49:33.438542 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-combined-ca-bundle\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.448453 master-0 kubenswrapper[28758]: I0223 14:49:33.448408 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2b6d\" (UniqueName: \"kubernetes.io/projected/2de7b549-c3f5-4105-8d7b-b0de62f9784e-kube-api-access-r2b6d\") pod \"placement-6bdcbb4f68-p649t\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.468150 master-0 kubenswrapper[28758]: I0223 14:49:33.468094 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67d05e7e-cde1-4bc2-93b0-62274ff2002a" (UID: "67d05e7e-cde1-4bc2-93b0-62274ff2002a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:33.539506 master-0 kubenswrapper[28758]: I0223 14:49:33.539006 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67d05e7e-cde1-4bc2-93b0-62274ff2002a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:33.583946 master-0 kubenswrapper[28758]: I0223 14:49:33.583699 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:33.624589 master-0 kubenswrapper[28758]: I0223 14:49:33.623747 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d9b67b7fc-m76r8"] Feb 23 14:49:33.673181 master-0 kubenswrapper[28758]: I0223 14:49:33.673118 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d9b67b7fc-m76r8"] Feb 23 14:49:33.734591 master-0 kubenswrapper[28758]: I0223 14:49:33.731259 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pn6b2"] Feb 23 14:49:33.765882 master-0 kubenswrapper[28758]: W0223 14:49:33.765828 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a94e87c_4854_4511_86a5_bb59bd265598.slice/crio-7c3a88e2c8ae556a3415bb6d0f9ef794ec8b5f3e4a646b35c2b827049020b7fd WatchSource:0}: Error finding container 7c3a88e2c8ae556a3415bb6d0f9ef794ec8b5f3e4a646b35c2b827049020b7fd: Status 404 returned error can't find the container with id 7c3a88e2c8ae556a3415bb6d0f9ef794ec8b5f3e4a646b35c2b827049020b7fd Feb 23 14:49:33.779617 master-0 kubenswrapper[28758]: I0223 14:49:33.778327 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 14:49:33.847956 master-0 kubenswrapper[28758]: W0223 14:49:33.847895 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc21cc06_410c_4afe_85d4_a72d8cebf881.slice/crio-0ef7982cd5cc6497dcf8db96b7e01a6d38aa2c80836d3b65a3fa8ce65d99959e WatchSource:0}: Error finding container 0ef7982cd5cc6497dcf8db96b7e01a6d38aa2c80836d3b65a3fa8ce65d99959e: Status 404 returned error can't find the container with id 0ef7982cd5cc6497dcf8db96b7e01a6d38aa2c80836d3b65a3fa8ce65d99959e Feb 23 14:49:33.850642 master-0 kubenswrapper[28758]: I0223 14:49:33.850607 28758 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ironic-db-sync-szljh"] Feb 23 14:49:34.112138 master-0 kubenswrapper[28758]: I0223 14:49:34.112078 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d05e7e-cde1-4bc2-93b0-62274ff2002a" path="/var/lib/kubelet/pods/67d05e7e-cde1-4bc2-93b0-62274ff2002a/volumes" Feb 23 14:49:34.112863 master-0 kubenswrapper[28758]: I0223 14:49:34.112829 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bdcbb4f68-p649t"] Feb 23 14:49:34.289324 master-0 kubenswrapper[28758]: I0223 14:49:34.289082 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-szljh" event={"ID":"cc21cc06-410c-4afe-85d4-a72d8cebf881","Type":"ContainerStarted","Data":"0ef7982cd5cc6497dcf8db96b7e01a6d38aa2c80836d3b65a3fa8ce65d99959e"} Feb 23 14:49:34.295517 master-0 kubenswrapper[28758]: I0223 14:49:34.295275 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-db-sync-f2ddt" event={"ID":"0378b9ac-d258-466f-8e8d-c1d27932f3b2","Type":"ContainerStarted","Data":"bc3660a666ef358c4127c7c478004d9485ab4800c068cb7d437da4c2d89f18b3"} Feb 23 14:49:34.301865 master-0 kubenswrapper[28758]: I0223 14:49:34.301705 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"a09b843c-6e74-4532-8ce4-26147e97d8c5","Type":"ContainerStarted","Data":"0de1652540e3d677e7cc14a5d0ddea1d004b11fbddcf5d72dd3721158d988478"} Feb 23 14:49:34.305546 master-0 kubenswrapper[28758]: I0223 14:49:34.305467 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pn6b2" event={"ID":"2a94e87c-4854-4511-86a5-bb59bd265598","Type":"ContainerStarted","Data":"db726734716cb9c758f04cf1225d4bf86e0a58d90cc3c2898c49f77f4d8f0dba"} Feb 23 14:49:34.305659 master-0 kubenswrapper[28758]: I0223 14:49:34.305555 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pn6b2" 
event={"ID":"2a94e87c-4854-4511-86a5-bb59bd265598","Type":"ContainerStarted","Data":"7c3a88e2c8ae556a3415bb6d0f9ef794ec8b5f3e4a646b35c2b827049020b7fd"} Feb 23 14:49:34.311173 master-0 kubenswrapper[28758]: I0223 14:49:34.311064 28758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:49:34.311173 master-0 kubenswrapper[28758]: I0223 14:49:34.311110 28758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:49:34.315897 master-0 kubenswrapper[28758]: I0223 14:49:34.312729 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdcbb4f68-p649t" event={"ID":"2de7b549-c3f5-4105-8d7b-b0de62f9784e","Type":"ContainerStarted","Data":"ccf69e2cba2f92aa359cf82dc448702d3e77ae74d7a89b757206a96b1a70f7e3"} Feb 23 14:49:34.319443 master-0 kubenswrapper[28758]: I0223 14:49:34.319359 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-db-sync-f2ddt" podStartSLOduration=3.273184863 podStartE2EDuration="20.319338245s" podCreationTimestamp="2026-02-23 14:49:14 +0000 UTC" firstStartedPulling="2026-02-23 14:49:16.159094482 +0000 UTC m=+888.285410414" lastFinishedPulling="2026-02-23 14:49:33.205247864 +0000 UTC m=+905.331563796" observedRunningTime="2026-02-23 14:49:34.312330408 +0000 UTC m=+906.438646340" watchObservedRunningTime="2026-02-23 14:49:34.319338245 +0000 UTC m=+906.445654177" Feb 23 14:49:34.348713 master-0 kubenswrapper[28758]: I0223 14:49:34.348622 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-63e78-default-external-api-0" podStartSLOduration=15.348601512 podStartE2EDuration="15.348601512s" podCreationTimestamp="2026-02-23 14:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:34.339149481 +0000 UTC m=+906.465465423" watchObservedRunningTime="2026-02-23 14:49:34.348601512 +0000 UTC 
m=+906.474917444" Feb 23 14:49:34.374677 master-0 kubenswrapper[28758]: I0223 14:49:34.374595 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pn6b2" podStartSLOduration=11.374574912 podStartE2EDuration="11.374574912s" podCreationTimestamp="2026-02-23 14:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:34.371127301 +0000 UTC m=+906.497443233" watchObservedRunningTime="2026-02-23 14:49:34.374574912 +0000 UTC m=+906.500890844" Feb 23 14:49:34.503308 master-0 kubenswrapper[28758]: I0223 14:49:34.503200 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:34.517321 master-0 kubenswrapper[28758]: I0223 14:49:34.515758 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:49:35.339506 master-0 kubenswrapper[28758]: I0223 14:49:35.339116 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdcbb4f68-p649t" event={"ID":"2de7b549-c3f5-4105-8d7b-b0de62f9784e","Type":"ContainerStarted","Data":"4f1c16bcc5617661be7a37f2eafd96e60f43859b2c7e4c4a5ff2bb5a0ce24f36"} Feb 23 14:49:35.339506 master-0 kubenswrapper[28758]: I0223 14:49:35.339181 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdcbb4f68-p649t" event={"ID":"2de7b549-c3f5-4105-8d7b-b0de62f9784e","Type":"ContainerStarted","Data":"8df68d5ca8c4ef5606a78ffd1f8761494e1613cefdb191fc895a8b358a8b3d9c"} Feb 23 14:49:35.677567 master-0 kubenswrapper[28758]: I0223 14:49:35.676771 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6bdcbb4f68-p649t" podStartSLOduration=2.67674464 podStartE2EDuration="2.67674464s" podCreationTimestamp="2026-02-23 14:49:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:35.664048063 +0000 UTC m=+907.790363995" watchObservedRunningTime="2026-02-23 14:49:35.67674464 +0000 UTC m=+907.803060572" Feb 23 14:49:36.352186 master-0 kubenswrapper[28758]: I0223 14:49:36.352021 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:36.352186 master-0 kubenswrapper[28758]: I0223 14:49:36.352098 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:49:37.364355 master-0 kubenswrapper[28758]: I0223 14:49:37.364296 28758 generic.go:334] "Generic (PLEG): container finished" podID="2a94e87c-4854-4511-86a5-bb59bd265598" containerID="db726734716cb9c758f04cf1225d4bf86e0a58d90cc3c2898c49f77f4d8f0dba" exitCode=0 Feb 23 14:49:37.364355 master-0 kubenswrapper[28758]: I0223 14:49:37.364341 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pn6b2" event={"ID":"2a94e87c-4854-4511-86a5-bb59bd265598","Type":"ContainerDied","Data":"db726734716cb9c758f04cf1225d4bf86e0a58d90cc3c2898c49f77f4d8f0dba"} Feb 23 14:49:39.355715 master-0 kubenswrapper[28758]: I0223 14:49:39.355605 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:39.405178 master-0 kubenswrapper[28758]: I0223 14:49:39.405112 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pn6b2" event={"ID":"2a94e87c-4854-4511-86a5-bb59bd265598","Type":"ContainerDied","Data":"7c3a88e2c8ae556a3415bb6d0f9ef794ec8b5f3e4a646b35c2b827049020b7fd"} Feb 23 14:49:39.405178 master-0 kubenswrapper[28758]: I0223 14:49:39.405157 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pn6b2" Feb 23 14:49:39.405178 master-0 kubenswrapper[28758]: I0223 14:49:39.405176 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c3a88e2c8ae556a3415bb6d0f9ef794ec8b5f3e4a646b35c2b827049020b7fd" Feb 23 14:49:39.416125 master-0 kubenswrapper[28758]: I0223 14:49:39.416069 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-credential-keys\") pod \"2a94e87c-4854-4511-86a5-bb59bd265598\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " Feb 23 14:49:39.416453 master-0 kubenswrapper[28758]: I0223 14:49:39.416439 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-fernet-keys\") pod \"2a94e87c-4854-4511-86a5-bb59bd265598\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " Feb 23 14:49:39.416587 master-0 kubenswrapper[28758]: I0223 14:49:39.416571 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbcz2\" (UniqueName: \"kubernetes.io/projected/2a94e87c-4854-4511-86a5-bb59bd265598-kube-api-access-lbcz2\") pod \"2a94e87c-4854-4511-86a5-bb59bd265598\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " Feb 23 14:49:39.416713 master-0 kubenswrapper[28758]: I0223 14:49:39.416700 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-combined-ca-bundle\") pod \"2a94e87c-4854-4511-86a5-bb59bd265598\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " Feb 23 14:49:39.416801 master-0 kubenswrapper[28758]: I0223 14:49:39.416790 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data\") pod \"2a94e87c-4854-4511-86a5-bb59bd265598\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " Feb 23 14:49:39.416911 master-0 kubenswrapper[28758]: I0223 14:49:39.416900 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-scripts\") pod \"2a94e87c-4854-4511-86a5-bb59bd265598\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " Feb 23 14:49:39.420650 master-0 kubenswrapper[28758]: I0223 14:49:39.420592 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2a94e87c-4854-4511-86a5-bb59bd265598" (UID: "2a94e87c-4854-4511-86a5-bb59bd265598"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:39.421052 master-0 kubenswrapper[28758]: I0223 14:49:39.420684 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a94e87c-4854-4511-86a5-bb59bd265598-kube-api-access-lbcz2" (OuterVolumeSpecName: "kube-api-access-lbcz2") pod "2a94e87c-4854-4511-86a5-bb59bd265598" (UID: "2a94e87c-4854-4511-86a5-bb59bd265598"). InnerVolumeSpecName "kube-api-access-lbcz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:39.421052 master-0 kubenswrapper[28758]: I0223 14:49:39.420708 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2a94e87c-4854-4511-86a5-bb59bd265598" (UID: "2a94e87c-4854-4511-86a5-bb59bd265598"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:39.423690 master-0 kubenswrapper[28758]: I0223 14:49:39.423653 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-scripts" (OuterVolumeSpecName: "scripts") pod "2a94e87c-4854-4511-86a5-bb59bd265598" (UID: "2a94e87c-4854-4511-86a5-bb59bd265598"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:39.454512 master-0 kubenswrapper[28758]: E0223 14:49:39.454438 28758 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data podName:2a94e87c-4854-4511-86a5-bb59bd265598 nodeName:}" failed. No retries permitted until 2026-02-23 14:49:39.954382989 +0000 UTC m=+912.080698941 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data") pod "2a94e87c-4854-4511-86a5-bb59bd265598" (UID: "2a94e87c-4854-4511-86a5-bb59bd265598") : error deleting /var/lib/kubelet/pods/2a94e87c-4854-4511-86a5-bb59bd265598/volume-subpaths: remove /var/lib/kubelet/pods/2a94e87c-4854-4511-86a5-bb59bd265598/volume-subpaths: no such file or directory Feb 23 14:49:39.458883 master-0 kubenswrapper[28758]: I0223 14:49:39.458823 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a94e87c-4854-4511-86a5-bb59bd265598" (UID: "2a94e87c-4854-4511-86a5-bb59bd265598"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:39.519174 master-0 kubenswrapper[28758]: I0223 14:49:39.519102 28758 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-credential-keys\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:39.519174 master-0 kubenswrapper[28758]: I0223 14:49:39.519155 28758 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-fernet-keys\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:39.519174 master-0 kubenswrapper[28758]: I0223 14:49:39.519169 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbcz2\" (UniqueName: \"kubernetes.io/projected/2a94e87c-4854-4511-86a5-bb59bd265598-kube-api-access-lbcz2\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:39.519174 master-0 kubenswrapper[28758]: I0223 14:49:39.519183 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:39.519174 master-0 kubenswrapper[28758]: I0223 14:49:39.519194 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:40.029877 master-0 kubenswrapper[28758]: I0223 14:49:40.029812 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data\") pod \"2a94e87c-4854-4511-86a5-bb59bd265598\" (UID: \"2a94e87c-4854-4511-86a5-bb59bd265598\") " Feb 23 14:49:40.032707 master-0 kubenswrapper[28758]: I0223 14:49:40.032673 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data" (OuterVolumeSpecName: "config-data") pod "2a94e87c-4854-4511-86a5-bb59bd265598" (UID: "2a94e87c-4854-4511-86a5-bb59bd265598"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:40.133269 master-0 kubenswrapper[28758]: I0223 14:49:40.133196 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a94e87c-4854-4511-86a5-bb59bd265598-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:40.424762 master-0 kubenswrapper[28758]: I0223 14:49:40.424358 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-szljh" event={"ID":"cc21cc06-410c-4afe-85d4-a72d8cebf881","Type":"ContainerStarted","Data":"f3fe2183d4c3c96c027d08265e909347e2bfe51dfe60c7bde4cf7f8db1acc464"} Feb 23 14:49:40.572452 master-0 kubenswrapper[28758]: I0223 14:49:40.572368 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56fc45f8f5-fsvgg"] Feb 23 14:49:40.573009 master-0 kubenswrapper[28758]: E0223 14:49:40.572965 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a94e87c-4854-4511-86a5-bb59bd265598" containerName="keystone-bootstrap" Feb 23 14:49:40.573009 master-0 kubenswrapper[28758]: I0223 14:49:40.572992 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a94e87c-4854-4511-86a5-bb59bd265598" containerName="keystone-bootstrap" Feb 23 14:49:40.573359 master-0 kubenswrapper[28758]: I0223 14:49:40.573326 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a94e87c-4854-4511-86a5-bb59bd265598" containerName="keystone-bootstrap" Feb 23 14:49:40.574543 master-0 kubenswrapper[28758]: I0223 14:49:40.574507 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.576668 master-0 kubenswrapper[28758]: I0223 14:49:40.576621 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 23 14:49:40.577396 master-0 kubenswrapper[28758]: I0223 14:49:40.577335 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 23 14:49:40.577519 master-0 kubenswrapper[28758]: I0223 14:49:40.577375 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 23 14:49:40.577519 master-0 kubenswrapper[28758]: I0223 14:49:40.577426 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 23 14:49:40.577721 master-0 kubenswrapper[28758]: I0223 14:49:40.577661 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 23 14:49:40.652775 master-0 kubenswrapper[28758]: I0223 14:49:40.652690 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-scripts\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.652775 master-0 kubenswrapper[28758]: I0223 14:49:40.652768 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-public-tls-certs\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.653067 master-0 kubenswrapper[28758]: I0223 14:49:40.652803 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-credential-keys\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.653067 master-0 kubenswrapper[28758]: I0223 14:49:40.652850 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-combined-ca-bundle\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.653067 master-0 kubenswrapper[28758]: I0223 14:49:40.653029 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-internal-tls-certs\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.653333 master-0 kubenswrapper[28758]: I0223 14:49:40.653266 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-fernet-keys\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.653405 master-0 kubenswrapper[28758]: I0223 14:49:40.653383 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmwf\" (UniqueName: \"kubernetes.io/projected/33a3834a-cb45-4a38-ab13-67b38425de47-kube-api-access-rrmwf\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.653601 master-0 kubenswrapper[28758]: I0223 14:49:40.653570 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-config-data\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.663852 master-0 kubenswrapper[28758]: I0223 14:49:40.663764 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56fc45f8f5-fsvgg"] Feb 23 14:49:40.755641 master-0 kubenswrapper[28758]: I0223 14:49:40.755563 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-scripts\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.755875 master-0 kubenswrapper[28758]: I0223 14:49:40.755648 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-public-tls-certs\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.755875 master-0 kubenswrapper[28758]: I0223 14:49:40.755709 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-credential-keys\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.755875 master-0 kubenswrapper[28758]: I0223 14:49:40.755835 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-combined-ca-bundle\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: 
\"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.755970 master-0 kubenswrapper[28758]: I0223 14:49:40.755886 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-internal-tls-certs\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.755970 master-0 kubenswrapper[28758]: I0223 14:49:40.755940 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-fernet-keys\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.756036 master-0 kubenswrapper[28758]: I0223 14:49:40.755978 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmwf\" (UniqueName: \"kubernetes.io/projected/33a3834a-cb45-4a38-ab13-67b38425de47-kube-api-access-rrmwf\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.756068 master-0 kubenswrapper[28758]: I0223 14:49:40.756043 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-config-data\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.758962 master-0 kubenswrapper[28758]: I0223 14:49:40.758920 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-public-tls-certs\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: 
\"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.759402 master-0 kubenswrapper[28758]: I0223 14:49:40.759357 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-scripts\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.759689 master-0 kubenswrapper[28758]: I0223 14:49:40.759649 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-fernet-keys\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.759741 master-0 kubenswrapper[28758]: I0223 14:49:40.759652 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-credential-keys\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.760227 master-0 kubenswrapper[28758]: I0223 14:49:40.760208 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-internal-tls-certs\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.760398 master-0 kubenswrapper[28758]: I0223 14:49:40.760349 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-combined-ca-bundle\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " 
pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.761684 master-0 kubenswrapper[28758]: I0223 14:49:40.761639 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a3834a-cb45-4a38-ab13-67b38425de47-config-data\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.887199 master-0 kubenswrapper[28758]: I0223 14:49:40.887153 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmwf\" (UniqueName: \"kubernetes.io/projected/33a3834a-cb45-4a38-ab13-67b38425de47-kube-api-access-rrmwf\") pod \"keystone-56fc45f8f5-fsvgg\" (UID: \"33a3834a-cb45-4a38-ab13-67b38425de47\") " pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:40.905189 master-0 kubenswrapper[28758]: I0223 14:49:40.905129 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:41.405183 master-0 kubenswrapper[28758]: W0223 14:49:41.405112 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33a3834a_cb45_4a38_ab13_67b38425de47.slice/crio-3e109278616d05f331e81f720e50f156289a60fefaed8448d3c83cfbb6bdb155 WatchSource:0}: Error finding container 3e109278616d05f331e81f720e50f156289a60fefaed8448d3c83cfbb6bdb155: Status 404 returned error can't find the container with id 3e109278616d05f331e81f720e50f156289a60fefaed8448d3c83cfbb6bdb155 Feb 23 14:49:41.410791 master-0 kubenswrapper[28758]: I0223 14:49:41.410685 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56fc45f8f5-fsvgg"] Feb 23 14:49:41.438672 master-0 kubenswrapper[28758]: I0223 14:49:41.438037 28758 generic.go:334] "Generic (PLEG): container finished" podID="cc21cc06-410c-4afe-85d4-a72d8cebf881" 
containerID="f3fe2183d4c3c96c027d08265e909347e2bfe51dfe60c7bde4cf7f8db1acc464" exitCode=0 Feb 23 14:49:41.438672 master-0 kubenswrapper[28758]: I0223 14:49:41.438143 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-szljh" event={"ID":"cc21cc06-410c-4afe-85d4-a72d8cebf881","Type":"ContainerDied","Data":"f3fe2183d4c3c96c027d08265e909347e2bfe51dfe60c7bde4cf7f8db1acc464"} Feb 23 14:49:41.449969 master-0 kubenswrapper[28758]: I0223 14:49:41.449888 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56fc45f8f5-fsvgg" event={"ID":"33a3834a-cb45-4a38-ab13-67b38425de47","Type":"ContainerStarted","Data":"3e109278616d05f331e81f720e50f156289a60fefaed8448d3c83cfbb6bdb155"} Feb 23 14:49:42.466000 master-0 kubenswrapper[28758]: I0223 14:49:42.465892 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56fc45f8f5-fsvgg" event={"ID":"33a3834a-cb45-4a38-ab13-67b38425de47","Type":"ContainerStarted","Data":"ef7b393de8e664b70280cbda614463cd593d8eaa03cd23576d4d3ae0c981d5d6"} Feb 23 14:49:42.466878 master-0 kubenswrapper[28758]: I0223 14:49:42.466394 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56fc45f8f5-fsvgg" Feb 23 14:49:42.473867 master-0 kubenswrapper[28758]: I0223 14:49:42.473805 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-szljh" event={"ID":"cc21cc06-410c-4afe-85d4-a72d8cebf881","Type":"ContainerStarted","Data":"f1899198a88f763740bd890a3b93d1aa145dbe8e2027b332ad777ba096752dec"} Feb 23 14:49:42.518843 master-0 kubenswrapper[28758]: I0223 14:49:42.511670 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56fc45f8f5-fsvgg" podStartSLOduration=3.511574326 podStartE2EDuration="3.511574326s" podCreationTimestamp="2026-02-23 14:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 14:49:42.50232039 +0000 UTC m=+914.628636322" watchObservedRunningTime="2026-02-23 14:49:42.511574326 +0000 UTC m=+914.637890288" Feb 23 14:49:42.591460 master-0 kubenswrapper[28758]: I0223 14:49:42.591353 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-szljh" podStartSLOduration=13.271878612 podStartE2EDuration="18.591325865s" podCreationTimestamp="2026-02-23 14:49:24 +0000 UTC" firstStartedPulling="2026-02-23 14:49:33.85169281 +0000 UTC m=+905.978008742" lastFinishedPulling="2026-02-23 14:49:39.171140063 +0000 UTC m=+911.297455995" observedRunningTime="2026-02-23 14:49:42.576667935 +0000 UTC m=+914.702983867" watchObservedRunningTime="2026-02-23 14:49:42.591325865 +0000 UTC m=+914.717641797" Feb 23 14:49:42.678577 master-0 kubenswrapper[28758]: I0223 14:49:42.678436 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:42.679160 master-0 kubenswrapper[28758]: I0223 14:49:42.679140 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:42.708428 master-0 kubenswrapper[28758]: I0223 14:49:42.708375 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:42.731278 master-0 kubenswrapper[28758]: I0223 14:49:42.731157 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:43.483604 master-0 kubenswrapper[28758]: I0223 14:49:43.483526 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:43.483604 master-0 kubenswrapper[28758]: I0223 14:49:43.483595 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:44.495259 master-0 kubenswrapper[28758]: I0223 14:49:44.495192 28758 generic.go:334] "Generic (PLEG): container finished" podID="0378b9ac-d258-466f-8e8d-c1d27932f3b2" containerID="bc3660a666ef358c4127c7c478004d9485ab4800c068cb7d437da4c2d89f18b3" exitCode=0 Feb 23 14:49:44.495796 master-0 kubenswrapper[28758]: I0223 14:49:44.495244 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-db-sync-f2ddt" event={"ID":"0378b9ac-d258-466f-8e8d-c1d27932f3b2","Type":"ContainerDied","Data":"bc3660a666ef358c4127c7c478004d9485ab4800c068cb7d437da4c2d89f18b3"} Feb 23 14:49:45.571222 master-0 kubenswrapper[28758]: I0223 14:49:45.571178 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:45.571883 master-0 kubenswrapper[28758]: I0223 14:49:45.571866 28758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:49:45.661041 master-0 kubenswrapper[28758]: I0223 14:49:45.660974 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:49:45.993236 master-0 kubenswrapper[28758]: I0223 14:49:45.993183 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-db-sync-f2ddt" Feb 23 14:49:46.103306 master-0 kubenswrapper[28758]: I0223 14:49:46.103175 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-db-sync-config-data\") pod \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " Feb 23 14:49:46.103306 master-0 kubenswrapper[28758]: I0223 14:49:46.103250 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-scripts\") pod \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " Feb 23 14:49:46.103655 master-0 kubenswrapper[28758]: I0223 14:49:46.103314 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr5xv\" (UniqueName: \"kubernetes.io/projected/0378b9ac-d258-466f-8e8d-c1d27932f3b2-kube-api-access-tr5xv\") pod \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " Feb 23 14:49:46.103655 master-0 kubenswrapper[28758]: I0223 14:49:46.103428 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-combined-ca-bundle\") pod \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " Feb 23 14:49:46.103655 master-0 kubenswrapper[28758]: I0223 14:49:46.103567 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0378b9ac-d258-466f-8e8d-c1d27932f3b2-etc-machine-id\") pod \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " Feb 23 14:49:46.103655 master-0 kubenswrapper[28758]: I0223 14:49:46.103615 28758 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-config-data\") pod \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\" (UID: \"0378b9ac-d258-466f-8e8d-c1d27932f3b2\") " Feb 23 14:49:46.104776 master-0 kubenswrapper[28758]: I0223 14:49:46.104737 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0378b9ac-d258-466f-8e8d-c1d27932f3b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0378b9ac-d258-466f-8e8d-c1d27932f3b2" (UID: "0378b9ac-d258-466f-8e8d-c1d27932f3b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:49:46.108043 master-0 kubenswrapper[28758]: I0223 14:49:46.107994 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0378b9ac-d258-466f-8e8d-c1d27932f3b2" (UID: "0378b9ac-d258-466f-8e8d-c1d27932f3b2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:46.109738 master-0 kubenswrapper[28758]: I0223 14:49:46.109611 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0378b9ac-d258-466f-8e8d-c1d27932f3b2-kube-api-access-tr5xv" (OuterVolumeSpecName: "kube-api-access-tr5xv") pod "0378b9ac-d258-466f-8e8d-c1d27932f3b2" (UID: "0378b9ac-d258-466f-8e8d-c1d27932f3b2"). InnerVolumeSpecName "kube-api-access-tr5xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:46.121046 master-0 kubenswrapper[28758]: I0223 14:49:46.120987 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-scripts" (OuterVolumeSpecName: "scripts") pod "0378b9ac-d258-466f-8e8d-c1d27932f3b2" (UID: "0378b9ac-d258-466f-8e8d-c1d27932f3b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:46.147190 master-0 kubenswrapper[28758]: I0223 14:49:46.147086 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0378b9ac-d258-466f-8e8d-c1d27932f3b2" (UID: "0378b9ac-d258-466f-8e8d-c1d27932f3b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:46.178136 master-0 kubenswrapper[28758]: I0223 14:49:46.178064 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-config-data" (OuterVolumeSpecName: "config-data") pod "0378b9ac-d258-466f-8e8d-c1d27932f3b2" (UID: "0378b9ac-d258-466f-8e8d-c1d27932f3b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:46.206383 master-0 kubenswrapper[28758]: I0223 14:49:46.206023 28758 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0378b9ac-d258-466f-8e8d-c1d27932f3b2-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:46.206383 master-0 kubenswrapper[28758]: I0223 14:49:46.206072 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:46.206383 master-0 kubenswrapper[28758]: I0223 14:49:46.206081 28758 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:46.206383 master-0 kubenswrapper[28758]: I0223 14:49:46.206092 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:46.206383 master-0 kubenswrapper[28758]: I0223 14:49:46.206103 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr5xv\" (UniqueName: \"kubernetes.io/projected/0378b9ac-d258-466f-8e8d-c1d27932f3b2-kube-api-access-tr5xv\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:46.206383 master-0 kubenswrapper[28758]: I0223 14:49:46.206111 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0378b9ac-d258-466f-8e8d-c1d27932f3b2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:46.522125 master-0 kubenswrapper[28758]: I0223 14:49:46.522043 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-db-sync-f2ddt" 
event={"ID":"0378b9ac-d258-466f-8e8d-c1d27932f3b2","Type":"ContainerDied","Data":"4e74d3742b197801ade72053810c65d00838300f1e021eb6a1f6d093d8cf0a11"} Feb 23 14:49:46.522125 master-0 kubenswrapper[28758]: I0223 14:49:46.522115 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e74d3742b197801ade72053810c65d00838300f1e021eb6a1f6d093d8cf0a11" Feb 23 14:49:46.522125 master-0 kubenswrapper[28758]: I0223 14:49:46.522083 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-db-sync-f2ddt" Feb 23 14:49:47.747105 master-0 kubenswrapper[28758]: I0223 14:49:47.747018 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:49:47.747896 master-0 kubenswrapper[28758]: E0223 14:49:47.747649 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0378b9ac-d258-466f-8e8d-c1d27932f3b2" containerName="cinder-2d990-db-sync" Feb 23 14:49:47.747896 master-0 kubenswrapper[28758]: I0223 14:49:47.747666 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0378b9ac-d258-466f-8e8d-c1d27932f3b2" containerName="cinder-2d990-db-sync" Feb 23 14:49:47.747896 master-0 kubenswrapper[28758]: I0223 14:49:47.747893 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0378b9ac-d258-466f-8e8d-c1d27932f3b2" containerName="cinder-2d990-db-sync" Feb 23 14:49:47.749794 master-0 kubenswrapper[28758]: I0223 14:49:47.749751 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.755084 master-0 kubenswrapper[28758]: I0223 14:49:47.755006 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-scripts" Feb 23 14:49:47.755541 master-0 kubenswrapper[28758]: I0223 14:49:47.755493 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-config-data" Feb 23 14:49:47.757241 master-0 kubenswrapper[28758]: I0223 14:49:47.755722 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-scheduler-config-data" Feb 23 14:49:47.795789 master-0 kubenswrapper[28758]: I0223 14:49:47.795736 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:49:47.846440 master-0 kubenswrapper[28758]: I0223 14:49:47.843920 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5fr\" (UniqueName: \"kubernetes.io/projected/4021d59b-dfa8-49cc-b55c-48469a02b971-kube-api-access-fl5fr\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.846440 master-0 kubenswrapper[28758]: I0223 14:49:47.844003 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-scripts\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.846440 master-0 kubenswrapper[28758]: I0223 14:49:47.844061 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data-custom\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " 
pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.846440 master-0 kubenswrapper[28758]: I0223 14:49:47.844104 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-combined-ca-bundle\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.846440 master-0 kubenswrapper[28758]: I0223 14:49:47.844163 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4021d59b-dfa8-49cc-b55c-48469a02b971-etc-machine-id\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.846440 master-0 kubenswrapper[28758]: I0223 14:49:47.844197 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.888885 master-0 kubenswrapper[28758]: I0223 14:49:47.875725 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56fd8c5c-tkwd6"] Feb 23 14:49:47.888885 master-0 kubenswrapper[28758]: I0223 14:49:47.878710 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:47.923179 master-0 kubenswrapper[28758]: I0223 14:49:47.923097 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:49:47.927981 master-0 kubenswrapper[28758]: I0223 14:49:47.926111 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:47.935997 master-0 kubenswrapper[28758]: I0223 14:49:47.933325 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-volume-lvm-iscsi-config-data" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.940730 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56fd8c5c-tkwd6"] Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950125 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4021d59b-dfa8-49cc-b55c-48469a02b971-etc-machine-id\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950262 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950411 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-sb\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950422 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4021d59b-dfa8-49cc-b55c-48469a02b971-etc-machine-id\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " 
pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950440 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-config\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950644 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-svc\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950717 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5fr\" (UniqueName: \"kubernetes.io/projected/4021d59b-dfa8-49cc-b55c-48469a02b971-kube-api-access-fl5fr\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950794 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-nb\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950848 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-scripts\") pod \"cinder-2d990-scheduler-0\" (UID: 
\"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.950916 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-944nj\" (UniqueName: \"kubernetes.io/projected/b7cce387-f350-41ac-9293-ca3c7e93681a-kube-api-access-944nj\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.951025 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data-custom\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.951118 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-swift-storage-0\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:47.954845 master-0 kubenswrapper[28758]: I0223 14:49:47.951213 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-combined-ca-bundle\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.962631 master-0 kubenswrapper[28758]: I0223 14:49:47.962557 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data\") 
pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.979987 master-0 kubenswrapper[28758]: I0223 14:49:47.979895 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-combined-ca-bundle\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:47.983286 master-0 kubenswrapper[28758]: I0223 14:49:47.983236 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data-custom\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:48.010741 master-0 kubenswrapper[28758]: I0223 14:49:48.008637 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-scripts\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:48.011654 master-0 kubenswrapper[28758]: I0223 14:49:48.011356 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:49:48.057832 master-0 kubenswrapper[28758]: I0223 14:49:48.054421 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5fr\" (UniqueName: \"kubernetes.io/projected/4021d59b-dfa8-49cc-b55c-48469a02b971-kube-api-access-fl5fr\") pod \"cinder-2d990-scheduler-0\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:48.063384 master-0 kubenswrapper[28758]: I0223 14:49:48.063332 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-run\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.064567 master-0 kubenswrapper[28758]: I0223 14:49:48.064331 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-swift-storage-0\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.064913 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-scripts\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.064956 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-combined-ca-bundle\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.065288 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data-custom\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 
14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.065415 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.065448 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-swift-storage-0\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.065533 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-lib-modules\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.066700 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-dev\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.066739 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-nvme\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: 
\"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.066773 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-sys\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.066839 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-brick\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.067004 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-sb\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.067175 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-config\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.067208 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-svc\") pod 
\"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.067452 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-machine-id\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.067526 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szxf8\" (UniqueName: \"kubernetes.io/projected/42d8d5ca-eecd-4a43-9a68-1099bab111aa-kube-api-access-szxf8\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.067581 master-0 kubenswrapper[28758]: I0223 14:49:48.067580 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-nb\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.067670 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-iscsi\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.067847 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-944nj\" 
(UniqueName: \"kubernetes.io/projected/b7cce387-f350-41ac-9293-ca3c7e93681a-kube-api-access-944nj\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.067914 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-svc\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.067936 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.067982 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-lib-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.068412 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-config\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.068455 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-sb\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.069536 master-0 kubenswrapper[28758]: I0223 14:49:48.069060 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-nb\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.106818 master-0 kubenswrapper[28758]: I0223 14:49:48.106750 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:48.113098 master-0 kubenswrapper[28758]: I0223 14:49:48.113041 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-944nj\" (UniqueName: \"kubernetes.io/projected/b7cce387-f350-41ac-9293-ca3c7e93681a-kube-api-access-944nj\") pod \"dnsmasq-dns-56fd8c5c-tkwd6\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.135127 master-0 kubenswrapper[28758]: I0223 14:49:48.132207 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:49:48.135127 master-0 kubenswrapper[28758]: I0223 14:49:48.133812 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:49:48.135127 master-0 kubenswrapper[28758]: I0223 14:49:48.133892 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.140345 master-0 kubenswrapper[28758]: I0223 14:49:48.140281 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-backup-config-data" Feb 23 14:49:48.163527 master-0 kubenswrapper[28758]: I0223 14:49:48.163099 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-api-0"] Feb 23 14:49:48.167210 master-0 kubenswrapper[28758]: I0223 14:49:48.165577 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.169432 master-0 kubenswrapper[28758]: I0223 14:49:48.169389 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-machine-id\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.169610 master-0 kubenswrapper[28758]: I0223 14:49:48.169443 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.169879 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-run\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.169916 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-scripts\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.169934 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-combined-ca-bundle\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.169978 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-nvme\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170008 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data-custom\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170088 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170129 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-scripts\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170172 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-brick\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170200 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-lib-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170236 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-lib-modules\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170262 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-dev\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170335 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-lib-modules\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170373 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-dev\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.171151 master-0 kubenswrapper[28758]: I0223 14:49:48.170960 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-api-config-data" Feb 23 14:49:48.172584 master-0 kubenswrapper[28758]: I0223 14:49:48.172546 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-iscsi\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.172671 master-0 kubenswrapper[28758]: I0223 14:49:48.172608 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-nvme\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.172671 master-0 kubenswrapper[28758]: I0223 14:49:48.172659 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-sys\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.172768 
master-0 kubenswrapper[28758]: I0223 14:49:48.172697 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.172768 master-0 kubenswrapper[28758]: I0223 14:49:48.172753 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-dev\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.173107 master-0 kubenswrapper[28758]: I0223 14:49:48.172786 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-brick\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.173107 master-0 kubenswrapper[28758]: I0223 14:49:48.172883 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-combined-ca-bundle\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.173107 master-0 kubenswrapper[28758]: I0223 14:49:48.172974 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-machine-id\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 
23 14:49:48.173107 master-0 kubenswrapper[28758]: I0223 14:49:48.173005 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szxf8\" (UniqueName: \"kubernetes.io/projected/42d8d5ca-eecd-4a43-9a68-1099bab111aa-kube-api-access-szxf8\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.173107 master-0 kubenswrapper[28758]: I0223 14:49:48.173086 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data-custom\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.173577 master-0 kubenswrapper[28758]: I0223 14:49:48.173114 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-sys\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.173577 master-0 kubenswrapper[28758]: I0223 14:49:48.173143 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-iscsi\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.173577 master-0 kubenswrapper[28758]: I0223 14:49:48.173207 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " 
pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.173577 master-0 kubenswrapper[28758]: I0223 14:49:48.173245 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-lib-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.173577 master-0 kubenswrapper[28758]: I0223 14:49:48.173298 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-lib-modules\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.173577 master-0 kubenswrapper[28758]: I0223 14:49:48.173328 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6zq4\" (UniqueName: \"kubernetes.io/projected/7a793659-5013-4d7e-a986-ea95782880c2-kube-api-access-w6zq4\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.173577 master-0 kubenswrapper[28758]: I0223 14:49:48.173357 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-run\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.173898 master-0 kubenswrapper[28758]: I0223 14:49:48.173598 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-run\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " 
pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.173898 master-0 kubenswrapper[28758]: I0223 14:49:48.173716 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-nvme\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.174128 master-0 kubenswrapper[28758]: I0223 14:49:48.174075 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-iscsi\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.174380 master-0 kubenswrapper[28758]: I0223 14:49:48.174246 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-sys\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.174380 master-0 kubenswrapper[28758]: I0223 14:49:48.174306 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.177817 master-0 kubenswrapper[28758]: I0223 14:49:48.174415 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-lib-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " 
pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.177817 master-0 kubenswrapper[28758]: I0223 14:49:48.174630 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-brick\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.177817 master-0 kubenswrapper[28758]: I0223 14:49:48.174994 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data-custom\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.177817 master-0 kubenswrapper[28758]: I0223 14:49:48.175041 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-api-0"] Feb 23 14:49:48.177817 master-0 kubenswrapper[28758]: I0223 14:49:48.175098 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-machine-id\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.177817 master-0 kubenswrapper[28758]: I0223 14:49:48.177793 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.178260 master-0 kubenswrapper[28758]: I0223 14:49:48.178216 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-scripts\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.182912 master-0 kubenswrapper[28758]: I0223 14:49:48.182854 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-combined-ca-bundle\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.194917 master-0 kubenswrapper[28758]: I0223 14:49:48.194800 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szxf8\" (UniqueName: \"kubernetes.io/projected/42d8d5ca-eecd-4a43-9a68-1099bab111aa-kube-api-access-szxf8\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.236998 master-0 kubenswrapper[28758]: I0223 14:49:48.236951 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:48.275865 master-0 kubenswrapper[28758]: I0223 14:49:48.275806 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-machine-id\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.276062 master-0 kubenswrapper[28758]: I0223 14:49:48.275873 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-combined-ca-bundle\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.276062 master-0 kubenswrapper[28758]: I0223 14:49:48.275905 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.276062 master-0 kubenswrapper[28758]: I0223 14:49:48.275996 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.276226 master-0 kubenswrapper[28758]: I0223 14:49:48.276061 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-machine-id\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " 
pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.276226 master-0 kubenswrapper[28758]: I0223 14:49:48.276161 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.276359 master-0 kubenswrapper[28758]: I0223 14:49:48.276321 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-run\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.276430 master-0 kubenswrapper[28758]: I0223 14:49:48.276352 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-run\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.276808 master-0 kubenswrapper[28758]: I0223 14:49:48.276560 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data-custom\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.276808 master-0 kubenswrapper[28758]: I0223 14:49:48.276617 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-nvme\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.276808 master-0 kubenswrapper[28758]: 
I0223 14:49:48.276736 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-nvme\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.277728 master-0 kubenswrapper[28758]: I0223 14:49:48.277686 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-scripts\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.277812 master-0 kubenswrapper[28758]: I0223 14:49:48.277733 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-brick\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278001 master-0 kubenswrapper[28758]: I0223 14:49:48.277959 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-lib-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278067 master-0 kubenswrapper[28758]: I0223 14:49:48.277998 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-brick\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278067 master-0 kubenswrapper[28758]: I0223 14:49:48.278037 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-lib-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278158 master-0 kubenswrapper[28758]: I0223 14:49:48.278080 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d289190e-0bfa-414d-8482-421efb521c2a-logs\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.278158 master-0 kubenswrapper[28758]: I0223 14:49:48.278129 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-iscsi\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278251 master-0 kubenswrapper[28758]: I0223 14:49:48.278175 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278251 master-0 kubenswrapper[28758]: I0223 14:49:48.278206 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-iscsi\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278251 master-0 kubenswrapper[28758]: I0223 14:49:48.278217 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-dev\") pod 
\"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278394 master-0 kubenswrapper[28758]: I0223 14:49:48.278281 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz4jx\" (UniqueName: \"kubernetes.io/projected/d289190e-0bfa-414d-8482-421efb521c2a-kube-api-access-pz4jx\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.278394 master-0 kubenswrapper[28758]: I0223 14:49:48.278317 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-combined-ca-bundle\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278504 master-0 kubenswrapper[28758]: I0223 14:49:48.278408 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-scripts\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.278553 master-0 kubenswrapper[28758]: I0223 14:49:48.278529 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-sys\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278612 master-0 kubenswrapper[28758]: I0223 14:49:48.278565 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data-custom\") pod \"cinder-2d990-backup-0\" (UID: 
\"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278663 master-0 kubenswrapper[28758]: I0223 14:49:48.278616 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-sys\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278708 master-0 kubenswrapper[28758]: I0223 14:49:48.278690 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d289190e-0bfa-414d-8482-421efb521c2a-etc-machine-id\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.278754 master-0 kubenswrapper[28758]: I0223 14:49:48.278723 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-lib-modules\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.278801 master-0 kubenswrapper[28758]: I0223 14:49:48.278758 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6zq4\" (UniqueName: \"kubernetes.io/projected/7a793659-5013-4d7e-a986-ea95782880c2-kube-api-access-w6zq4\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.279223 master-0 kubenswrapper[28758]: I0223 14:49:48.279192 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-lib-modules\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " 
pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.280098 master-0 kubenswrapper[28758]: I0223 14:49:48.279267 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-dev\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.282226 master-0 kubenswrapper[28758]: I0223 14:49:48.281030 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-scripts\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.283571 master-0 kubenswrapper[28758]: I0223 14:49:48.283521 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-combined-ca-bundle\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.289568 master-0 kubenswrapper[28758]: I0223 14:49:48.283794 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data-custom\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.289568 master-0 kubenswrapper[28758]: I0223 14:49:48.287336 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.345130 master-0 kubenswrapper[28758]: I0223 14:49:48.345080 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6zq4\" (UniqueName: \"kubernetes.io/projected/7a793659-5013-4d7e-a986-ea95782880c2-kube-api-access-w6zq4\") pod \"cinder-2d990-backup-0\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.381116 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.381279 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data-custom\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.381365 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d289190e-0bfa-414d-8482-421efb521c2a-logs\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.381443 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz4jx\" (UniqueName: \"kubernetes.io/projected/d289190e-0bfa-414d-8482-421efb521c2a-kube-api-access-pz4jx\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.381663 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-scripts\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.381725 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d289190e-0bfa-414d-8482-421efb521c2a-etc-machine-id\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.381757 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-combined-ca-bundle\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.385503 master-0 kubenswrapper[28758]: I0223 14:49:48.382579 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d289190e-0bfa-414d-8482-421efb521c2a-logs\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.393662 master-0 kubenswrapper[28758]: I0223 14:49:48.387816 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-combined-ca-bundle\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.393662 master-0 kubenswrapper[28758]: I0223 14:49:48.389376 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d289190e-0bfa-414d-8482-421efb521c2a-etc-machine-id\") pod 
\"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.405969 master-0 kubenswrapper[28758]: I0223 14:49:48.399650 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data-custom\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.423733 master-0 kubenswrapper[28758]: I0223 14:49:48.410440 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.438716 master-0 kubenswrapper[28758]: I0223 14:49:48.427063 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz4jx\" (UniqueName: \"kubernetes.io/projected/d289190e-0bfa-414d-8482-421efb521c2a-kube-api-access-pz4jx\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.457581 master-0 kubenswrapper[28758]: I0223 14:49:48.456037 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-scripts\") pod \"cinder-2d990-api-0\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") " pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.477602 master-0 kubenswrapper[28758]: I0223 14:49:48.477544 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:48.557163 master-0 kubenswrapper[28758]: I0223 14:49:48.557089 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:48.570685 master-0 kubenswrapper[28758]: I0223 14:49:48.570285 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-api-0" Feb 23 14:49:48.812639 master-0 kubenswrapper[28758]: I0223 14:49:48.812580 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:49:48.942357 master-0 kubenswrapper[28758]: I0223 14:49:48.942295 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56fd8c5c-tkwd6"] Feb 23 14:49:48.948291 master-0 kubenswrapper[28758]: W0223 14:49:48.947052 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7cce387_f350_41ac_9293_ca3c7e93681a.slice/crio-613bcf8892538b1cc18e766c27dcd7339f6d323819869c272d3a608d169a9595 WatchSource:0}: Error finding container 613bcf8892538b1cc18e766c27dcd7339f6d323819869c272d3a608d169a9595: Status 404 returned error can't find the container with id 613bcf8892538b1cc18e766c27dcd7339f6d323819869c272d3a608d169a9595 Feb 23 14:49:49.188844 master-0 kubenswrapper[28758]: I0223 14:49:49.183042 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:49:49.416541 master-0 kubenswrapper[28758]: I0223 14:49:49.416140 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-api-0"] Feb 23 14:49:49.537207 master-0 kubenswrapper[28758]: W0223 14:49:49.537143 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd289190e_0bfa_414d_8482_421efb521c2a.slice/crio-ad711b32806d21fdd299e8127650de9e235f34d35cd66955ad3a2dbe7ed4b627 WatchSource:0}: Error finding container ad711b32806d21fdd299e8127650de9e235f34d35cd66955ad3a2dbe7ed4b627: Status 404 returned error can't find the container with id 
ad711b32806d21fdd299e8127650de9e235f34d35cd66955ad3a2dbe7ed4b627 Feb 23 14:49:49.547505 master-0 kubenswrapper[28758]: W0223 14:49:49.547298 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a793659_5013_4d7e_a986_ea95782880c2.slice/crio-603a95c4ec2210a0fcdee0b0f3e894ce0c00cf99b448ab802058f2b047e1fa69 WatchSource:0}: Error finding container 603a95c4ec2210a0fcdee0b0f3e894ce0c00cf99b448ab802058f2b047e1fa69: Status 404 returned error can't find the container with id 603a95c4ec2210a0fcdee0b0f3e894ce0c00cf99b448ab802058f2b047e1fa69 Feb 23 14:49:49.555427 master-0 kubenswrapper[28758]: I0223 14:49:49.555377 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:49:49.561126 master-0 kubenswrapper[28758]: I0223 14:49:49.561051 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"d289190e-0bfa-414d-8482-421efb521c2a","Type":"ContainerStarted","Data":"ad711b32806d21fdd299e8127650de9e235f34d35cd66955ad3a2dbe7ed4b627"} Feb 23 14:49:49.571599 master-0 kubenswrapper[28758]: I0223 14:49:49.567011 28758 generic.go:334] "Generic (PLEG): container finished" podID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerID="ce4678450a0a5b191545732e9365298462af447b51d03c19b04b940607a70ff4" exitCode=0 Feb 23 14:49:49.571599 master-0 kubenswrapper[28758]: I0223 14:49:49.567090 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" event={"ID":"b7cce387-f350-41ac-9293-ca3c7e93681a","Type":"ContainerDied","Data":"ce4678450a0a5b191545732e9365298462af447b51d03c19b04b940607a70ff4"} Feb 23 14:49:49.571599 master-0 kubenswrapper[28758]: I0223 14:49:49.567121 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" 
event={"ID":"b7cce387-f350-41ac-9293-ca3c7e93681a","Type":"ContainerStarted","Data":"613bcf8892538b1cc18e766c27dcd7339f6d323819869c272d3a608d169a9595"} Feb 23 14:49:49.572689 master-0 kubenswrapper[28758]: I0223 14:49:49.572413 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"42d8d5ca-eecd-4a43-9a68-1099bab111aa","Type":"ContainerStarted","Data":"6fa8b148b095cdbd794cb63df596cd695896caadbdfe78ebcd369f966d6f37eb"} Feb 23 14:49:49.574099 master-0 kubenswrapper[28758]: I0223 14:49:49.574067 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"4021d59b-dfa8-49cc-b55c-48469a02b971","Type":"ContainerStarted","Data":"af68541bc58e379745c979863813dd1df390c09cb5e65da31042416700954961"} Feb 23 14:49:49.575326 master-0 kubenswrapper[28758]: E0223 14:49:49.575289 28758 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7cce387_f350_41ac_9293_ca3c7e93681a.slice/crio-conmon-ce4678450a0a5b191545732e9365298462af447b51d03c19b04b940607a70ff4.scope\": RecentStats: unable to find data in memory cache]" Feb 23 14:49:50.589611 master-0 kubenswrapper[28758]: I0223 14:49:50.589545 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" event={"ID":"b7cce387-f350-41ac-9293-ca3c7e93681a","Type":"ContainerStarted","Data":"ab83798ecd0a01f893c05c912022a7b2f2a13fa225bcbd0f1c1a5b5f9983730f"} Feb 23 14:49:50.590913 master-0 kubenswrapper[28758]: I0223 14:49:50.589977 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:50.595705 master-0 kubenswrapper[28758]: I0223 14:49:50.593842 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" 
event={"ID":"4021d59b-dfa8-49cc-b55c-48469a02b971","Type":"ContainerStarted","Data":"e7b0433721951dfb37923e3f3397313fe35bb598c89aa4a7ffc288127028a2ad"} Feb 23 14:49:50.608164 master-0 kubenswrapper[28758]: I0223 14:49:50.608091 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"d289190e-0bfa-414d-8482-421efb521c2a","Type":"ContainerStarted","Data":"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"} Feb 23 14:49:50.615769 master-0 kubenswrapper[28758]: I0223 14:49:50.615549 28758 generic.go:334] "Generic (PLEG): container finished" podID="6d0c7920-4214-40e6-83d6-46e306919ec4" containerID="07f8bb525d47f0913ba3f4e74c1e76a4eb4a1c1bed843ddb4abcdc854b95ce0a" exitCode=0 Feb 23 14:49:50.615769 master-0 kubenswrapper[28758]: I0223 14:49:50.615626 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tnlrn" event={"ID":"6d0c7920-4214-40e6-83d6-46e306919ec4","Type":"ContainerDied","Data":"07f8bb525d47f0913ba3f4e74c1e76a4eb4a1c1bed843ddb4abcdc854b95ce0a"} Feb 23 14:49:50.621608 master-0 kubenswrapper[28758]: I0223 14:49:50.621146 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"7a793659-5013-4d7e-a986-ea95782880c2","Type":"ContainerStarted","Data":"603a95c4ec2210a0fcdee0b0f3e894ce0c00cf99b448ab802058f2b047e1fa69"} Feb 23 14:49:50.622124 master-0 kubenswrapper[28758]: I0223 14:49:50.622047 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" podStartSLOduration=3.621998134 podStartE2EDuration="3.621998134s" podCreationTimestamp="2026-02-23 14:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:50.61359649 +0000 UTC m=+922.739912432" watchObservedRunningTime="2026-02-23 14:49:50.621998134 +0000 UTC m=+922.748314066" Feb 23 14:49:50.955048 master-0 
kubenswrapper[28758]: I0223 14:49:50.954519 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-api-0"] Feb 23 14:49:51.646124 master-0 kubenswrapper[28758]: I0223 14:49:51.643382 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"42d8d5ca-eecd-4a43-9a68-1099bab111aa","Type":"ContainerStarted","Data":"9d782ea0bcfed09e73794591745ae7dd51f3de5161f1e96d6cb094ade09f86aa"} Feb 23 14:49:51.646124 master-0 kubenswrapper[28758]: I0223 14:49:51.643461 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"42d8d5ca-eecd-4a43-9a68-1099bab111aa","Type":"ContainerStarted","Data":"3bdcc4b7933f394730d3f24b022b4af86aac0763903d06917b66ab3c00fdc562"} Feb 23 14:49:51.651901 master-0 kubenswrapper[28758]: I0223 14:49:51.651435 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"4021d59b-dfa8-49cc-b55c-48469a02b971","Type":"ContainerStarted","Data":"d10703ef54d01e9d2bd5964168e28f6bb29d723aece463c73eb4d8418eb7106d"} Feb 23 14:49:51.654322 master-0 kubenswrapper[28758]: I0223 14:49:51.654289 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"d289190e-0bfa-414d-8482-421efb521c2a","Type":"ContainerStarted","Data":"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"} Feb 23 14:49:51.655310 master-0 kubenswrapper[28758]: I0223 14:49:51.654575 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-2d990-api-0" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-2d990-api-log" containerID="cri-o://77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b" gracePeriod=30 Feb 23 14:49:51.655310 master-0 kubenswrapper[28758]: I0223 14:49:51.654662 28758 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-2d990-api-0" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-api" containerID="cri-o://546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21" gracePeriod=30 Feb 23 14:49:51.655541 master-0 kubenswrapper[28758]: I0223 14:49:51.655523 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-2d990-api-0" Feb 23 14:49:51.660721 master-0 kubenswrapper[28758]: I0223 14:49:51.660679 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"7a793659-5013-4d7e-a986-ea95782880c2","Type":"ContainerStarted","Data":"08cf06726322e72e4cb4550309df3da0cc5dbd24384fbcb6e109a66128208f17"} Feb 23 14:49:51.660915 master-0 kubenswrapper[28758]: I0223 14:49:51.660901 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"7a793659-5013-4d7e-a986-ea95782880c2","Type":"ContainerStarted","Data":"86b06d12ffc94bcbdc6b4b284ec6fa03e058c14f82ea14b05587f96271ce4a86"} Feb 23 14:49:51.736799 master-0 kubenswrapper[28758]: I0223 14:49:51.736660 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" podStartSLOduration=2.85708881 podStartE2EDuration="4.736638388s" podCreationTimestamp="2026-02-23 14:49:47 +0000 UTC" firstStartedPulling="2026-02-23 14:49:49.19165193 +0000 UTC m=+921.317967862" lastFinishedPulling="2026-02-23 14:49:51.071201508 +0000 UTC m=+923.197517440" observedRunningTime="2026-02-23 14:49:51.686147226 +0000 UTC m=+923.812463178" watchObservedRunningTime="2026-02-23 14:49:51.736638388 +0000 UTC m=+923.862954320" Feb 23 14:49:51.741876 master-0 kubenswrapper[28758]: I0223 14:49:51.741791 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-api-0" podStartSLOduration=3.741768714 podStartE2EDuration="3.741768714s" podCreationTimestamp="2026-02-23 14:49:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:51.712041924 +0000 UTC m=+923.838357856" watchObservedRunningTime="2026-02-23 14:49:51.741768714 +0000 UTC m=+923.868084636" Feb 23 14:49:51.755932 master-0 kubenswrapper[28758]: I0223 14:49:51.755666 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-backup-0" podStartSLOduration=4.05812329 podStartE2EDuration="4.755641443s" podCreationTimestamp="2026-02-23 14:49:47 +0000 UTC" firstStartedPulling="2026-02-23 14:49:49.5891154 +0000 UTC m=+921.715431332" lastFinishedPulling="2026-02-23 14:49:50.286633553 +0000 UTC m=+922.412949485" observedRunningTime="2026-02-23 14:49:51.740026288 +0000 UTC m=+923.866342220" watchObservedRunningTime="2026-02-23 14:49:51.755641443 +0000 UTC m=+923.881957385" Feb 23 14:49:51.834503 master-0 kubenswrapper[28758]: I0223 14:49:51.833411 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-scheduler-0" podStartSLOduration=4.06601322 podStartE2EDuration="4.833399209s" podCreationTimestamp="2026-02-23 14:49:47 +0000 UTC" firstStartedPulling="2026-02-23 14:49:48.82545409 +0000 UTC m=+920.951770022" lastFinishedPulling="2026-02-23 14:49:49.592840069 +0000 UTC m=+921.719156011" observedRunningTime="2026-02-23 14:49:51.832896905 +0000 UTC m=+923.959212848" watchObservedRunningTime="2026-02-23 14:49:51.833399209 +0000 UTC m=+923.959715141" Feb 23 14:49:52.145553 master-0 kubenswrapper[28758]: I0223 14:49:52.145470 28758 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod651d3a04-116a-4337-8f42-3865d8a0b9be"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod651d3a04-116a-4337-8f42-3865d8a0b9be] : Timed out while waiting for systemd to remove kubepods-besteffort-pod651d3a04_116a_4337_8f42_3865d8a0b9be.slice" Feb 23 14:49:52.145553 
master-0 kubenswrapper[28758]: E0223 14:49:52.145535 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod651d3a04-116a-4337-8f42-3865d8a0b9be] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod651d3a04-116a-4337-8f42-3865d8a0b9be] : Timed out while waiting for systemd to remove kubepods-besteffort-pod651d3a04_116a_4337_8f42_3865d8a0b9be.slice" pod="openstack/ironic-279b-account-create-update-vbsbw" podUID="651d3a04-116a-4337-8f42-3865d8a0b9be"
Feb 23 14:49:52.210925 master-0 kubenswrapper[28758]: I0223 14:49:52.210848 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tnlrn"
Feb 23 14:49:52.352842 master-0 kubenswrapper[28758]: I0223 14:49:52.352774 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-combined-ca-bundle\") pod \"6d0c7920-4214-40e6-83d6-46e306919ec4\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") "
Feb 23 14:49:52.352842 master-0 kubenswrapper[28758]: I0223 14:49:52.352833 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-config\") pod \"6d0c7920-4214-40e6-83d6-46e306919ec4\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") "
Feb 23 14:49:52.353135 master-0 kubenswrapper[28758]: I0223 14:49:52.353040 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rt88\" (UniqueName: \"kubernetes.io/projected/6d0c7920-4214-40e6-83d6-46e306919ec4-kube-api-access-5rt88\") pod \"6d0c7920-4214-40e6-83d6-46e306919ec4\" (UID: \"6d0c7920-4214-40e6-83d6-46e306919ec4\") "
Feb 23 14:49:52.356524 master-0 kubenswrapper[28758]: I0223 14:49:52.356422 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0c7920-4214-40e6-83d6-46e306919ec4-kube-api-access-5rt88" (OuterVolumeSpecName: "kube-api-access-5rt88") pod "6d0c7920-4214-40e6-83d6-46e306919ec4" (UID: "6d0c7920-4214-40e6-83d6-46e306919ec4"). InnerVolumeSpecName "kube-api-access-5rt88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:49:52.392572 master-0 kubenswrapper[28758]: I0223 14:49:52.392515 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d0c7920-4214-40e6-83d6-46e306919ec4" (UID: "6d0c7920-4214-40e6-83d6-46e306919ec4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:49:52.402620 master-0 kubenswrapper[28758]: I0223 14:49:52.402555 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-config" (OuterVolumeSpecName: "config") pod "6d0c7920-4214-40e6-83d6-46e306919ec4" (UID: "6d0c7920-4214-40e6-83d6-46e306919ec4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:49:52.434701 master-0 kubenswrapper[28758]: I0223 14:49:52.434271 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:52.456375 master-0 kubenswrapper[28758]: I0223 14:49:52.456301 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rt88\" (UniqueName: \"kubernetes.io/projected/6d0c7920-4214-40e6-83d6-46e306919ec4-kube-api-access-5rt88\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.456375 master-0 kubenswrapper[28758]: I0223 14:49:52.456354 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.456375 master-0 kubenswrapper[28758]: I0223 14:49:52.456368 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6d0c7920-4214-40e6-83d6-46e306919ec4-config\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.557505 master-0 kubenswrapper[28758]: I0223 14:49:52.557427 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d289190e-0bfa-414d-8482-421efb521c2a-logs\") pod \"d289190e-0bfa-414d-8482-421efb521c2a\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") "
Feb 23 14:49:52.557851 master-0 kubenswrapper[28758]: I0223 14:49:52.557538 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz4jx\" (UniqueName: \"kubernetes.io/projected/d289190e-0bfa-414d-8482-421efb521c2a-kube-api-access-pz4jx\") pod \"d289190e-0bfa-414d-8482-421efb521c2a\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") "
Feb 23 14:49:52.558114 master-0 kubenswrapper[28758]: I0223 14:49:52.558062 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d289190e-0bfa-414d-8482-421efb521c2a-logs" (OuterVolumeSpecName: "logs") pod "d289190e-0bfa-414d-8482-421efb521c2a" (UID: "d289190e-0bfa-414d-8482-421efb521c2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:49:52.558329 master-0 kubenswrapper[28758]: I0223 14:49:52.558295 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data-custom\") pod \"d289190e-0bfa-414d-8482-421efb521c2a\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") "
Feb 23 14:49:52.558388 master-0 kubenswrapper[28758]: I0223 14:49:52.558352 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d289190e-0bfa-414d-8482-421efb521c2a-etc-machine-id\") pod \"d289190e-0bfa-414d-8482-421efb521c2a\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") "
Feb 23 14:49:52.558507 master-0 kubenswrapper[28758]: I0223 14:49:52.558451 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-scripts\") pod \"d289190e-0bfa-414d-8482-421efb521c2a\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") "
Feb 23 14:49:52.558569 master-0 kubenswrapper[28758]: I0223 14:49:52.558529 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d289190e-0bfa-414d-8482-421efb521c2a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d289190e-0bfa-414d-8482-421efb521c2a" (UID: "d289190e-0bfa-414d-8482-421efb521c2a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 14:49:52.558761 master-0 kubenswrapper[28758]: I0223 14:49:52.558716 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-combined-ca-bundle\") pod \"d289190e-0bfa-414d-8482-421efb521c2a\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") "
Feb 23 14:49:52.558837 master-0 kubenswrapper[28758]: I0223 14:49:52.558826 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data\") pod \"d289190e-0bfa-414d-8482-421efb521c2a\" (UID: \"d289190e-0bfa-414d-8482-421efb521c2a\") "
Feb 23 14:49:52.559979 master-0 kubenswrapper[28758]: I0223 14:49:52.559924 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d289190e-0bfa-414d-8482-421efb521c2a-logs\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.559979 master-0 kubenswrapper[28758]: I0223 14:49:52.559959 28758 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d289190e-0bfa-414d-8482-421efb521c2a-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.561403 master-0 kubenswrapper[28758]: I0223 14:49:52.561237 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d289190e-0bfa-414d-8482-421efb521c2a-kube-api-access-pz4jx" (OuterVolumeSpecName: "kube-api-access-pz4jx") pod "d289190e-0bfa-414d-8482-421efb521c2a" (UID: "d289190e-0bfa-414d-8482-421efb521c2a"). InnerVolumeSpecName "kube-api-access-pz4jx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:49:52.561947 master-0 kubenswrapper[28758]: I0223 14:49:52.561916 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-scripts" (OuterVolumeSpecName: "scripts") pod "d289190e-0bfa-414d-8482-421efb521c2a" (UID: "d289190e-0bfa-414d-8482-421efb521c2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:49:52.562985 master-0 kubenswrapper[28758]: I0223 14:49:52.562852 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d289190e-0bfa-414d-8482-421efb521c2a" (UID: "d289190e-0bfa-414d-8482-421efb521c2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:49:52.616016 master-0 kubenswrapper[28758]: I0223 14:49:52.615886 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d289190e-0bfa-414d-8482-421efb521c2a" (UID: "d289190e-0bfa-414d-8482-421efb521c2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:49:52.648949 master-0 kubenswrapper[28758]: I0223 14:49:52.645695 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data" (OuterVolumeSpecName: "config-data") pod "d289190e-0bfa-414d-8482-421efb521c2a" (UID: "d289190e-0bfa-414d-8482-421efb521c2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:49:52.662230 master-0 kubenswrapper[28758]: I0223 14:49:52.661891 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz4jx\" (UniqueName: \"kubernetes.io/projected/d289190e-0bfa-414d-8482-421efb521c2a-kube-api-access-pz4jx\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.662230 master-0 kubenswrapper[28758]: I0223 14:49:52.661938 28758 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data-custom\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.662230 master-0 kubenswrapper[28758]: I0223 14:49:52.661950 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.662230 master-0 kubenswrapper[28758]: I0223 14:49:52.661961 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.662230 master-0 kubenswrapper[28758]: I0223 14:49:52.661972 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d289190e-0bfa-414d-8482-421efb521c2a-config-data\") on node \"master-0\" DevicePath \"\""
Feb 23 14:49:52.705399 master-0 kubenswrapper[28758]: I0223 14:49:52.705321 28758 generic.go:334] "Generic (PLEG): container finished" podID="d289190e-0bfa-414d-8482-421efb521c2a" containerID="546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21" exitCode=0
Feb 23 14:49:52.705399 master-0 kubenswrapper[28758]: I0223 14:49:52.705377 28758 generic.go:334] "Generic (PLEG): container finished" podID="d289190e-0bfa-414d-8482-421efb521c2a" containerID="77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b" exitCode=143
Feb 23 14:49:52.706330 master-0 kubenswrapper[28758]: I0223 14:49:52.705428 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"d289190e-0bfa-414d-8482-421efb521c2a","Type":"ContainerDied","Data":"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"}
Feb 23 14:49:52.706330 master-0 kubenswrapper[28758]: I0223 14:49:52.705465 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"d289190e-0bfa-414d-8482-421efb521c2a","Type":"ContainerDied","Data":"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"}
Feb 23 14:49:52.706330 master-0 kubenswrapper[28758]: I0223 14:49:52.705504 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"d289190e-0bfa-414d-8482-421efb521c2a","Type":"ContainerDied","Data":"ad711b32806d21fdd299e8127650de9e235f34d35cd66955ad3a2dbe7ed4b627"}
Feb 23 14:49:52.706330 master-0 kubenswrapper[28758]: I0223 14:49:52.705526 28758 scope.go:117] "RemoveContainer" containerID="546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"
Feb 23 14:49:52.706330 master-0 kubenswrapper[28758]: I0223 14:49:52.705569 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:52.708092 master-0 kubenswrapper[28758]: I0223 14:49:52.708043 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tnlrn"
Feb 23 14:49:52.708092 master-0 kubenswrapper[28758]: I0223 14:49:52.708065 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-279b-account-create-update-vbsbw"
Feb 23 14:49:52.708325 master-0 kubenswrapper[28758]: I0223 14:49:52.708117 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tnlrn" event={"ID":"6d0c7920-4214-40e6-83d6-46e306919ec4","Type":"ContainerDied","Data":"6cf6a2e7247c6aeaf35c96acf10bdb278d896ed014749941e4872c96a2ef299f"}
Feb 23 14:49:52.708325 master-0 kubenswrapper[28758]: I0223 14:49:52.708149 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf6a2e7247c6aeaf35c96acf10bdb278d896ed014749941e4872c96a2ef299f"
Feb 23 14:49:52.746599 master-0 kubenswrapper[28758]: I0223 14:49:52.746549 28758 scope.go:117] "RemoveContainer" containerID="77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"
Feb 23 14:49:52.775901 master-0 kubenswrapper[28758]: I0223 14:49:52.775758 28758 scope.go:117] "RemoveContainer" containerID="546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"
Feb 23 14:49:52.776598 master-0 kubenswrapper[28758]: E0223 14:49:52.776377 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21\": container with ID starting with 546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21 not found: ID does not exist" containerID="546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"
Feb 23 14:49:52.776598 master-0 kubenswrapper[28758]: I0223 14:49:52.776415 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"} err="failed to get container status \"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21\": rpc error: code = NotFound desc = could not find container \"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21\": container with ID starting with 546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21 not found: ID does not exist"
Feb 23 14:49:52.776598 master-0 kubenswrapper[28758]: I0223 14:49:52.776436 28758 scope.go:117] "RemoveContainer" containerID="77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"
Feb 23 14:49:52.779495 master-0 kubenswrapper[28758]: E0223 14:49:52.779398 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b\": container with ID starting with 77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b not found: ID does not exist" containerID="77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"
Feb 23 14:49:52.779738 master-0 kubenswrapper[28758]: I0223 14:49:52.779615 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"} err="failed to get container status \"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b\": rpc error: code = NotFound desc = could not find container \"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b\": container with ID starting with 77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b not found: ID does not exist"
Feb 23 14:49:52.779738 master-0 kubenswrapper[28758]: I0223 14:49:52.779672 28758 scope.go:117] "RemoveContainer" containerID="546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"
Feb 23 14:49:52.780576 master-0 kubenswrapper[28758]: I0223 14:49:52.780541 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21"} err="failed to get container status \"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21\": rpc error: code = NotFound desc = could not find container \"546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21\": container with ID starting with 546090b41e80668dadf15c66f794bc7a03bfda8d0cb8ef46f8937fd3d97ebb21 not found: ID does not exist"
Feb 23 14:49:52.780690 master-0 kubenswrapper[28758]: I0223 14:49:52.780577 28758 scope.go:117] "RemoveContainer" containerID="77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"
Feb 23 14:49:52.781047 master-0 kubenswrapper[28758]: I0223 14:49:52.780957 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b"} err="failed to get container status \"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b\": rpc error: code = NotFound desc = could not find container \"77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b\": container with ID starting with 77446772071b5cb8ee89e1e32162dc9af28d270f9637fa3a835884c842dc471b not found: ID does not exist"
Feb 23 14:49:52.800122 master-0 kubenswrapper[28758]: I0223 14:49:52.800050 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-api-0"]
Feb 23 14:49:52.829216 master-0 kubenswrapper[28758]: I0223 14:49:52.829146 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2d990-api-0"]
Feb 23 14:49:52.874664 master-0 kubenswrapper[28758]: I0223 14:49:52.873824 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-api-0"]
Feb 23 14:49:52.892807 master-0 kubenswrapper[28758]: E0223 14:49:52.892741 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-2d990-api-log"
Feb 23 14:49:52.892807 master-0 kubenswrapper[28758]: I0223 14:49:52.892796 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-2d990-api-log"
Feb 23 14:49:52.893025 master-0 kubenswrapper[28758]: E0223 14:49:52.892821 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0c7920-4214-40e6-83d6-46e306919ec4" containerName="neutron-db-sync"
Feb 23 14:49:52.893025 master-0 kubenswrapper[28758]: I0223 14:49:52.892831 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0c7920-4214-40e6-83d6-46e306919ec4" containerName="neutron-db-sync"
Feb 23 14:49:52.893025 master-0 kubenswrapper[28758]: E0223 14:49:52.892873 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-api"
Feb 23 14:49:52.893025 master-0 kubenswrapper[28758]: I0223 14:49:52.892883 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-api"
Feb 23 14:49:52.893152 master-0 kubenswrapper[28758]: I0223 14:49:52.893100 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-api"
Feb 23 14:49:52.893152 master-0 kubenswrapper[28758]: I0223 14:49:52.893137 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0c7920-4214-40e6-83d6-46e306919ec4" containerName="neutron-db-sync"
Feb 23 14:49:52.893210 master-0 kubenswrapper[28758]: I0223 14:49:52.893159 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="d289190e-0bfa-414d-8482-421efb521c2a" containerName="cinder-2d990-api-log"
Feb 23 14:49:52.894736 master-0 kubenswrapper[28758]: I0223 14:49:52.894340 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-api-0"]
Feb 23 14:49:52.894736 master-0 kubenswrapper[28758]: I0223 14:49:52.894427 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:52.897114 master-0 kubenswrapper[28758]: I0223 14:49:52.897065 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 23 14:49:52.897290 master-0 kubenswrapper[28758]: I0223 14:49:52.897261 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 23 14:49:52.897457 master-0 kubenswrapper[28758]: I0223 14:49:52.897431 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-api-config-data"
Feb 23 14:49:53.013836 master-0 kubenswrapper[28758]: I0223 14:49:53.013777 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56fd8c5c-tkwd6"]
Feb 23 14:49:53.014090 master-0 kubenswrapper[28758]: I0223 14:49:53.014050 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" podUID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerName="dnsmasq-dns" containerID="cri-o://ab83798ecd0a01f893c05c912022a7b2f2a13fa225bcbd0f1c1a5b5f9983730f" gracePeriod=10
Feb 23 14:49:53.077494 master-0 kubenswrapper[28758]: I0223 14:49:53.076995 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvpsz\" (UniqueName: \"kubernetes.io/projected/1c260b91-35a4-4ed2-800e-14c67846ca98-kube-api-access-wvpsz\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.077494 master-0 kubenswrapper[28758]: I0223 14:49:53.077174 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-public-tls-certs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.077494 master-0 kubenswrapper[28758]: I0223 14:49:53.077240 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-scripts\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.077494 master-0 kubenswrapper[28758]: I0223 14:49:53.077295 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"]
Feb 23 14:49:53.077494 master-0 kubenswrapper[28758]: I0223 14:49:53.077425 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c260b91-35a4-4ed2-800e-14c67846ca98-etc-machine-id\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.077807 master-0 kubenswrapper[28758]: I0223 14:49:53.077567 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-config-data\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.077807 master-0 kubenswrapper[28758]: I0223 14:49:53.077646 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-combined-ca-bundle\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.077807 master-0 kubenswrapper[28758]: I0223 14:49:53.077775 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c260b91-35a4-4ed2-800e-14c67846ca98-logs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.094540 master-0 kubenswrapper[28758]: I0223 14:49:53.077877 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-internal-tls-certs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.094540 master-0 kubenswrapper[28758]: I0223 14:49:53.078069 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-config-data-custom\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.094540 master-0 kubenswrapper[28758]: I0223 14:49:53.079715 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.098470 master-0 kubenswrapper[28758]: I0223 14:49:53.098414 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84f9bb9b4d-7ttsw"]
Feb 23 14:49:53.103597 master-0 kubenswrapper[28758]: I0223 14:49:53.103538 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84f9bb9b4d-7ttsw"
Feb 23 14:49:53.122524 master-0 kubenswrapper[28758]: I0223 14:49:53.108327 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-2d990-scheduler-0"
Feb 23 14:49:53.122524 master-0 kubenswrapper[28758]: I0223 14:49:53.112139 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84f9bb9b4d-7ttsw"]
Feb 23 14:49:53.122524 master-0 kubenswrapper[28758]: I0223 14:49:53.119289 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 23 14:49:53.122524 master-0 kubenswrapper[28758]: I0223 14:49:53.119454 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 23 14:49:53.122524 master-0 kubenswrapper[28758]: I0223 14:49:53.119560 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 23 14:49:53.172655 master-0 kubenswrapper[28758]: I0223 14:49:53.172605 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"]
Feb 23 14:49:53.180544 master-0 kubenswrapper[28758]: I0223 14:49:53.180349 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.180544 master-0 kubenswrapper[28758]: I0223 14:49:53.180456 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpsz\" (UniqueName: \"kubernetes.io/projected/1c260b91-35a4-4ed2-800e-14c67846ca98-kube-api-access-wvpsz\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.180544 master-0 kubenswrapper[28758]: I0223 14:49:53.180549 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-public-tls-certs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180583 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-scripts\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180607 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-config\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180685 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c260b91-35a4-4ed2-800e-14c67846ca98-etc-machine-id\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180746 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hskm6\" (UniqueName: \"kubernetes.io/projected/e4ab41cd-61e0-4117-9d0f-108f29ae692c-kube-api-access-hskm6\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180774 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-config-data\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180824 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-combined-ca-bundle\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180880 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180957 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c260b91-35a4-4ed2-800e-14c67846ca98-logs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.180987 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-internal-tls-certs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.181011 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.181049 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-svc\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.181318 master-0 kubenswrapper[28758]: I0223 14:49:53.181162 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-config-data-custom\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.182915 master-0 kubenswrapper[28758]: I0223 14:49:53.182879 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c260b91-35a4-4ed2-800e-14c67846ca98-etc-machine-id\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.188796 master-0 kubenswrapper[28758]: I0223 14:49:53.188751 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-internal-tls-certs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.189259 master-0 kubenswrapper[28758]: I0223 14:49:53.189229 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c260b91-35a4-4ed2-800e-14c67846ca98-logs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.190329 master-0 kubenswrapper[28758]: I0223 14:49:53.190276 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-combined-ca-bundle\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.192220 master-0 kubenswrapper[28758]: I0223 14:49:53.192175 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-config-data-custom\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.193344 master-0 kubenswrapper[28758]: I0223 14:49:53.193308 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-public-tls-certs\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.200017 master-0 kubenswrapper[28758]: I0223 14:49:53.199959 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-config-data\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.203984 master-0 kubenswrapper[28758]: I0223 14:49:53.203921 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c260b91-35a4-4ed2-800e-14c67846ca98-scripts\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.212438 master-0 kubenswrapper[28758]: I0223 14:49:53.210290 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpsz\" (UniqueName: \"kubernetes.io/projected/1c260b91-35a4-4ed2-800e-14c67846ca98-kube-api-access-wvpsz\") pod \"cinder-2d990-api-0\" (UID: \"1c260b91-35a4-4ed2-800e-14c67846ca98\") " pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.273338 master-0 kubenswrapper[28758]: I0223 14:49:53.273071 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-api-0"
Feb 23 14:49:53.287825 master-0 kubenswrapper[28758]: I0223 14:49:53.287770 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-httpd-config\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw"
Feb 23 14:49:53.288124 master-0 kubenswrapper[28758]: I0223 14:49:53.288023 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-ovndb-tls-certs\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw"
Feb 23 14:49:53.288124 master-0 kubenswrapper[28758]: I0223 14:49:53.288079 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hskm6\" (UniqueName: \"kubernetes.io/projected/e4ab41cd-61e0-4117-9d0f-108f29ae692c-kube-api-access-hskm6\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.288124 master-0 kubenswrapper[28758]: I0223 14:49:53.288107 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-config\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw"
Feb 23 14:49:53.288754 master-0 kubenswrapper[28758]: I0223 14:49:53.288714 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.288819 master-0 kubenswrapper[28758]: I0223 14:49:53.288806 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-combined-ca-bundle\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw"
Feb 23 14:49:53.289575 master-0 kubenswrapper[28758]: I0223 14:49:53.288870 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.289575 master-0 kubenswrapper[28758]: I0223 14:49:53.288920 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-svc\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"
Feb 23 14:49:53.289575 master-0 kubenswrapper[28758]:
I0223 14:49:53.289021 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvsvk\" (UniqueName: \"kubernetes.io/projected/8ee1afef-0950-4013-95a6-e6ae71a155eb-kube-api-access-rvsvk\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.289575 master-0 kubenswrapper[28758]: I0223 14:49:53.289059 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.289575 master-0 kubenswrapper[28758]: I0223 14:49:53.289202 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-config\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.290215 master-0 kubenswrapper[28758]: I0223 14:49:53.289869 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-sb\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.290215 master-0 kubenswrapper[28758]: I0223 14:49:53.290199 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-config\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.290494 master-0 kubenswrapper[28758]: I0223 
14:49:53.290461 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-svc\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.290748 master-0 kubenswrapper[28758]: I0223 14:49:53.290678 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-swift-storage-0\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.291542 master-0 kubenswrapper[28758]: I0223 14:49:53.291506 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-nb\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.311548 master-0 kubenswrapper[28758]: I0223 14:49:53.309018 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hskm6\" (UniqueName: \"kubernetes.io/projected/e4ab41cd-61e0-4117-9d0f-108f29ae692c-kube-api-access-hskm6\") pod \"dnsmasq-dns-6b69b7cf8f-pcb5c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.395008 master-0 kubenswrapper[28758]: I0223 14:49:53.394942 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-ovndb-tls-certs\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.395008 master-0 kubenswrapper[28758]: I0223 14:49:53.395004 
28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-config\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.395312 master-0 kubenswrapper[28758]: I0223 14:49:53.395078 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-combined-ca-bundle\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.395312 master-0 kubenswrapper[28758]: I0223 14:49:53.395151 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvsvk\" (UniqueName: \"kubernetes.io/projected/8ee1afef-0950-4013-95a6-e6ae71a155eb-kube-api-access-rvsvk\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.395312 master-0 kubenswrapper[28758]: I0223 14:49:53.395235 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-httpd-config\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.399051 master-0 kubenswrapper[28758]: I0223 14:49:53.399017 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-httpd-config\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.399885 master-0 kubenswrapper[28758]: I0223 14:49:53.399845 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-ovndb-tls-certs\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.419622 master-0 kubenswrapper[28758]: I0223 14:49:53.406328 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-config\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.419622 master-0 kubenswrapper[28758]: I0223 14:49:53.408271 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-combined-ca-bundle\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.419622 master-0 kubenswrapper[28758]: I0223 14:49:53.419273 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvsvk\" (UniqueName: \"kubernetes.io/projected/8ee1afef-0950-4013-95a6-e6ae71a155eb-kube-api-access-rvsvk\") pod \"neutron-84f9bb9b4d-7ttsw\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.486677 master-0 kubenswrapper[28758]: I0223 14:49:53.484541 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:53.560722 master-0 kubenswrapper[28758]: I0223 14:49:53.560652 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:53.588593 master-0 kubenswrapper[28758]: I0223 14:49:53.588523 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:53.619336 master-0 kubenswrapper[28758]: I0223 14:49:53.612923 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:53.754462 master-0 kubenswrapper[28758]: I0223 14:49:53.754406 28758 generic.go:334] "Generic (PLEG): container finished" podID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerID="ab83798ecd0a01f893c05c912022a7b2f2a13fa225bcbd0f1c1a5b5f9983730f" exitCode=0 Feb 23 14:49:53.755432 master-0 kubenswrapper[28758]: I0223 14:49:53.755398 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" event={"ID":"b7cce387-f350-41ac-9293-ca3c7e93681a","Type":"ContainerDied","Data":"ab83798ecd0a01f893c05c912022a7b2f2a13fa225bcbd0f1c1a5b5f9983730f"} Feb 23 14:49:53.755488 master-0 kubenswrapper[28758]: I0223 14:49:53.755432 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" event={"ID":"b7cce387-f350-41ac-9293-ca3c7e93681a","Type":"ContainerDied","Data":"613bcf8892538b1cc18e766c27dcd7339f6d323819869c272d3a608d169a9595"} Feb 23 14:49:53.755488 master-0 kubenswrapper[28758]: I0223 14:49:53.755444 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="613bcf8892538b1cc18e766c27dcd7339f6d323819869c272d3a608d169a9595" Feb 23 14:49:53.758189 master-0 kubenswrapper[28758]: I0223 14:49:53.758150 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:53.927742 master-0 kubenswrapper[28758]: I0223 14:49:53.927699 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-swift-storage-0\") pod \"b7cce387-f350-41ac-9293-ca3c7e93681a\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " Feb 23 14:49:53.927863 master-0 kubenswrapper[28758]: I0223 14:49:53.927752 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-nb\") pod \"b7cce387-f350-41ac-9293-ca3c7e93681a\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " Feb 23 14:49:53.927863 master-0 kubenswrapper[28758]: I0223 14:49:53.927828 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-svc\") pod \"b7cce387-f350-41ac-9293-ca3c7e93681a\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " Feb 23 14:49:53.927931 master-0 kubenswrapper[28758]: I0223 14:49:53.927866 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-config\") pod \"b7cce387-f350-41ac-9293-ca3c7e93681a\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " Feb 23 14:49:53.927974 master-0 kubenswrapper[28758]: I0223 14:49:53.927938 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-sb\") pod \"b7cce387-f350-41ac-9293-ca3c7e93681a\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " Feb 23 14:49:53.928011 master-0 kubenswrapper[28758]: I0223 14:49:53.927990 28758 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-944nj\" (UniqueName: \"kubernetes.io/projected/b7cce387-f350-41ac-9293-ca3c7e93681a-kube-api-access-944nj\") pod \"b7cce387-f350-41ac-9293-ca3c7e93681a\" (UID: \"b7cce387-f350-41ac-9293-ca3c7e93681a\") " Feb 23 14:49:53.937780 master-0 kubenswrapper[28758]: I0223 14:49:53.937434 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7cce387-f350-41ac-9293-ca3c7e93681a-kube-api-access-944nj" (OuterVolumeSpecName: "kube-api-access-944nj") pod "b7cce387-f350-41ac-9293-ca3c7e93681a" (UID: "b7cce387-f350-41ac-9293-ca3c7e93681a"). InnerVolumeSpecName "kube-api-access-944nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:53.945583 master-0 kubenswrapper[28758]: I0223 14:49:53.944762 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-api-0"] Feb 23 14:49:54.049434 master-0 kubenswrapper[28758]: I0223 14:49:54.045008 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-944nj\" (UniqueName: \"kubernetes.io/projected/b7cce387-f350-41ac-9293-ca3c7e93681a-kube-api-access-944nj\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:54.049434 master-0 kubenswrapper[28758]: I0223 14:49:54.045676 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7cce387-f350-41ac-9293-ca3c7e93681a" (UID: "b7cce387-f350-41ac-9293-ca3c7e93681a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:54.051589 master-0 kubenswrapper[28758]: I0223 14:49:54.051530 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7cce387-f350-41ac-9293-ca3c7e93681a" (UID: "b7cce387-f350-41ac-9293-ca3c7e93681a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:54.055314 master-0 kubenswrapper[28758]: I0223 14:49:54.055121 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-config" (OuterVolumeSpecName: "config") pod "b7cce387-f350-41ac-9293-ca3c7e93681a" (UID: "b7cce387-f350-41ac-9293-ca3c7e93681a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:54.092224 master-0 kubenswrapper[28758]: I0223 14:49:54.092154 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7cce387-f350-41ac-9293-ca3c7e93681a" (UID: "b7cce387-f350-41ac-9293-ca3c7e93681a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:54.130358 master-0 kubenswrapper[28758]: I0223 14:49:54.124819 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b7cce387-f350-41ac-9293-ca3c7e93681a" (UID: "b7cce387-f350-41ac-9293-ca3c7e93681a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:49:54.150143 master-0 kubenswrapper[28758]: I0223 14:49:54.150065 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d289190e-0bfa-414d-8482-421efb521c2a" path="/var/lib/kubelet/pods/d289190e-0bfa-414d-8482-421efb521c2a/volumes" Feb 23 14:49:54.150961 master-0 kubenswrapper[28758]: I0223 14:49:54.150911 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:54.150961 master-0 kubenswrapper[28758]: I0223 14:49:54.150951 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:54.150961 master-0 kubenswrapper[28758]: I0223 14:49:54.150961 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:54.151248 master-0 kubenswrapper[28758]: I0223 14:49:54.150980 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:54.151248 master-0 kubenswrapper[28758]: I0223 14:49:54.150990 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7cce387-f350-41ac-9293-ca3c7e93681a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:54.168445 master-0 kubenswrapper[28758]: W0223 14:49:54.168376 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4ab41cd_61e0_4117_9d0f_108f29ae692c.slice/crio-36aaac46fbc3ad1bbc0c1dbbd2d00f3be0000f33be79d3351597577ca0d1a0f1 WatchSource:0}: Error finding container 36aaac46fbc3ad1bbc0c1dbbd2d00f3be0000f33be79d3351597577ca0d1a0f1: Status 404 returned error can't find the container with id 36aaac46fbc3ad1bbc0c1dbbd2d00f3be0000f33be79d3351597577ca0d1a0f1 Feb 23 14:49:54.189086 master-0 kubenswrapper[28758]: I0223 14:49:54.189014 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"] Feb 23 14:49:54.399805 master-0 kubenswrapper[28758]: I0223 14:49:54.399471 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84f9bb9b4d-7ttsw"] Feb 23 14:49:54.413375 master-0 kubenswrapper[28758]: W0223 14:49:54.411800 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ee1afef_0950_4013_95a6_e6ae71a155eb.slice/crio-1709dd9011be00258bddd92c5bbc5b79ca173ed231d874f86ca8a08dab81637f WatchSource:0}: Error finding container 1709dd9011be00258bddd92c5bbc5b79ca173ed231d874f86ca8a08dab81637f: Status 404 returned error can't find the container with id 1709dd9011be00258bddd92c5bbc5b79ca173ed231d874f86ca8a08dab81637f Feb 23 14:49:54.770918 master-0 kubenswrapper[28758]: I0223 14:49:54.770860 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f9bb9b4d-7ttsw" event={"ID":"8ee1afef-0950-4013-95a6-e6ae71a155eb","Type":"ContainerStarted","Data":"a074856ff5b977bc702c4e244794e3d8c685bcf0b57c1ebe80bebe8f24288af9"} Feb 23 14:49:54.771540 master-0 kubenswrapper[28758]: I0223 14:49:54.771517 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f9bb9b4d-7ttsw" event={"ID":"8ee1afef-0950-4013-95a6-e6ae71a155eb","Type":"ContainerStarted","Data":"1709dd9011be00258bddd92c5bbc5b79ca173ed231d874f86ca8a08dab81637f"} Feb 23 14:49:54.773286 
master-0 kubenswrapper[28758]: I0223 14:49:54.773226 28758 generic.go:334] "Generic (PLEG): container finished" podID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerID="aa0b709d714808d5efab61d938e3547055f70e8e14c6e2eafa2711f9e7448570" exitCode=0 Feb 23 14:49:54.773413 master-0 kubenswrapper[28758]: I0223 14:49:54.773318 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" event={"ID":"e4ab41cd-61e0-4117-9d0f-108f29ae692c","Type":"ContainerDied","Data":"aa0b709d714808d5efab61d938e3547055f70e8e14c6e2eafa2711f9e7448570"} Feb 23 14:49:54.773413 master-0 kubenswrapper[28758]: I0223 14:49:54.773351 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" event={"ID":"e4ab41cd-61e0-4117-9d0f-108f29ae692c","Type":"ContainerStarted","Data":"36aaac46fbc3ad1bbc0c1dbbd2d00f3be0000f33be79d3351597577ca0d1a0f1"} Feb 23 14:49:54.778575 master-0 kubenswrapper[28758]: I0223 14:49:54.777916 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"1c260b91-35a4-4ed2-800e-14c67846ca98","Type":"ContainerStarted","Data":"9773fcf76531d22c1cbd4fa2a98ab097dc64a818b2bd019b47c24ba854ea1430"} Feb 23 14:49:54.778575 master-0 kubenswrapper[28758]: I0223 14:49:54.777964 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"1c260b91-35a4-4ed2-800e-14c67846ca98","Type":"ContainerStarted","Data":"afb7c59a8359b89762f333f67a4a5ba508b306f62def1c86d672c21c3c1bec17"} Feb 23 14:49:54.778575 master-0 kubenswrapper[28758]: I0223 14:49:54.777973 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56fd8c5c-tkwd6" Feb 23 14:49:54.877896 master-0 kubenswrapper[28758]: I0223 14:49:54.877833 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56fd8c5c-tkwd6"] Feb 23 14:49:54.894564 master-0 kubenswrapper[28758]: I0223 14:49:54.894363 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56fd8c5c-tkwd6"] Feb 23 14:49:55.823501 master-0 kubenswrapper[28758]: I0223 14:49:55.820657 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" event={"ID":"e4ab41cd-61e0-4117-9d0f-108f29ae692c","Type":"ContainerStarted","Data":"16c08e5f7966221a02fc02f63d75f242d6fccae0624607fa8aab15528d4dcee7"} Feb 23 14:49:55.823501 master-0 kubenswrapper[28758]: I0223 14:49:55.822337 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:49:55.826589 master-0 kubenswrapper[28758]: I0223 14:49:55.826533 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-api-0" event={"ID":"1c260b91-35a4-4ed2-800e-14c67846ca98","Type":"ContainerStarted","Data":"e2fd31bec63ac8c03e382c7f18da0d7c18b5db697f5d1d43b438611625e7f7c4"} Feb 23 14:49:55.827460 master-0 kubenswrapper[28758]: I0223 14:49:55.827429 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-2d990-api-0" Feb 23 14:49:55.840861 master-0 kubenswrapper[28758]: I0223 14:49:55.840785 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f9bb9b4d-7ttsw" event={"ID":"8ee1afef-0950-4013-95a6-e6ae71a155eb","Type":"ContainerStarted","Data":"f445dd7a9552e21e8b19b2f1128fc3bdf2dd91e14d688462a2ecd74bf371a0f5"} Feb 23 14:49:55.841836 master-0 kubenswrapper[28758]: I0223 14:49:55.841794 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:49:55.862506 master-0 
kubenswrapper[28758]: I0223 14:49:55.862047 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" podStartSLOduration=2.862028656 podStartE2EDuration="2.862028656s" podCreationTimestamp="2026-02-23 14:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:55.861699547 +0000 UTC m=+927.988015479" watchObservedRunningTime="2026-02-23 14:49:55.862028656 +0000 UTC m=+927.988344588" Feb 23 14:49:55.916501 master-0 kubenswrapper[28758]: I0223 14:49:55.915212 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84f9bb9b4d-7ttsw" podStartSLOduration=2.915193358 podStartE2EDuration="2.915193358s" podCreationTimestamp="2026-02-23 14:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:55.907751651 +0000 UTC m=+928.034067583" watchObservedRunningTime="2026-02-23 14:49:55.915193358 +0000 UTC m=+928.041509290" Feb 23 14:49:55.959506 master-0 kubenswrapper[28758]: I0223 14:49:55.958632 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-api-0" podStartSLOduration=3.958608962 podStartE2EDuration="3.958608962s" podCreationTimestamp="2026-02-23 14:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:55.948422471 +0000 UTC m=+928.074738413" watchObservedRunningTime="2026-02-23 14:49:55.958608962 +0000 UTC m=+928.084924904" Feb 23 14:49:56.101054 master-0 kubenswrapper[28758]: I0223 14:49:56.100947 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7cce387-f350-41ac-9293-ca3c7e93681a" path="/var/lib/kubelet/pods/b7cce387-f350-41ac-9293-ca3c7e93681a/volumes" Feb 23 14:49:56.101972 master-0 
kubenswrapper[28758]: I0223 14:49:56.101949 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b8b575dff-c9z8b"] Feb 23 14:49:56.102470 master-0 kubenswrapper[28758]: E0223 14:49:56.102450 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerName="dnsmasq-dns" Feb 23 14:49:56.102659 master-0 kubenswrapper[28758]: I0223 14:49:56.102644 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerName="dnsmasq-dns" Feb 23 14:49:56.102745 master-0 kubenswrapper[28758]: E0223 14:49:56.102733 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerName="init" Feb 23 14:49:56.102815 master-0 kubenswrapper[28758]: I0223 14:49:56.102803 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerName="init" Feb 23 14:49:56.103186 master-0 kubenswrapper[28758]: I0223 14:49:56.103168 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7cce387-f350-41ac-9293-ca3c7e93681a" containerName="dnsmasq-dns" Feb 23 14:49:56.104939 master-0 kubenswrapper[28758]: I0223 14:49:56.104913 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.108349 master-0 kubenswrapper[28758]: I0223 14:49:56.108301 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 23 14:49:56.109167 master-0 kubenswrapper[28758]: I0223 14:49:56.109128 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 23 14:49:56.129570 master-0 kubenswrapper[28758]: I0223 14:49:56.128941 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b8b575dff-c9z8b"] Feb 23 14:49:56.306777 master-0 kubenswrapper[28758]: I0223 14:49:56.306670 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-ovndb-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.306998 master-0 kubenswrapper[28758]: I0223 14:49:56.306892 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-combined-ca-bundle\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.306998 master-0 kubenswrapper[28758]: I0223 14:49:56.306967 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-config\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.307108 master-0 kubenswrapper[28758]: I0223 14:49:56.307064 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-bxrn9\" (UniqueName: \"kubernetes.io/projected/575c77e7-ca40-4cb6-9ef4-9938f44932ec-kube-api-access-bxrn9\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.307454 master-0 kubenswrapper[28758]: I0223 14:49:56.307412 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-httpd-config\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.307911 master-0 kubenswrapper[28758]: I0223 14:49:56.307890 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-internal-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.308030 master-0 kubenswrapper[28758]: I0223 14:49:56.308012 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-public-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.410775 master-0 kubenswrapper[28758]: I0223 14:49:56.410658 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-combined-ca-bundle\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.411046 master-0 kubenswrapper[28758]: I0223 14:49:56.411018 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-config\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.411207 master-0 kubenswrapper[28758]: I0223 14:49:56.411187 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxrn9\" (UniqueName: \"kubernetes.io/projected/575c77e7-ca40-4cb6-9ef4-9938f44932ec-kube-api-access-bxrn9\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.411419 master-0 kubenswrapper[28758]: I0223 14:49:56.411399 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-httpd-config\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.411601 master-0 kubenswrapper[28758]: I0223 14:49:56.411583 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-internal-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.411699 master-0 kubenswrapper[28758]: I0223 14:49:56.411686 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-public-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.411810 master-0 kubenswrapper[28758]: I0223 14:49:56.411795 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-ovndb-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.415186 master-0 kubenswrapper[28758]: I0223 14:49:56.415011 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-combined-ca-bundle\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.415393 master-0 kubenswrapper[28758]: I0223 14:49:56.415353 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-internal-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.415523 master-0 kubenswrapper[28758]: I0223 14:49:56.415493 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-httpd-config\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.418529 master-0 kubenswrapper[28758]: I0223 14:49:56.418465 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-public-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.419518 master-0 kubenswrapper[28758]: I0223 14:49:56.419469 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-config\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.421822 master-0 kubenswrapper[28758]: I0223 14:49:56.421464 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/575c77e7-ca40-4cb6-9ef4-9938f44932ec-ovndb-tls-certs\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.432352 master-0 kubenswrapper[28758]: I0223 14:49:56.432292 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxrn9\" (UniqueName: \"kubernetes.io/projected/575c77e7-ca40-4cb6-9ef4-9938f44932ec-kube-api-access-bxrn9\") pod \"neutron-5b8b575dff-c9z8b\" (UID: \"575c77e7-ca40-4cb6-9ef4-9938f44932ec\") " pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:56.433223 master-0 kubenswrapper[28758]: I0223 14:49:56.433174 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:57.035398 master-0 kubenswrapper[28758]: I0223 14:49:57.035310 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b8b575dff-c9z8b"] Feb 23 14:49:57.055830 master-0 kubenswrapper[28758]: W0223 14:49:57.050093 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575c77e7_ca40_4cb6_9ef4_9938f44932ec.slice/crio-7ed47431ce0bfc9fe26b1cc2906518ed271a68c2b7670420ac45525531b30d11 WatchSource:0}: Error finding container 7ed47431ce0bfc9fe26b1cc2906518ed271a68c2b7670420ac45525531b30d11: Status 404 returned error can't find the container with id 7ed47431ce0bfc9fe26b1cc2906518ed271a68c2b7670420ac45525531b30d11 Feb 23 14:49:57.876539 master-0 kubenswrapper[28758]: I0223 14:49:57.875886 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b8b575dff-c9z8b" event={"ID":"575c77e7-ca40-4cb6-9ef4-9938f44932ec","Type":"ContainerStarted","Data":"9f735995e1de9afac55d98cc5d524cb2ccd2502682b4648ee09ba1d7f2253aee"} Feb 23 14:49:57.876539 master-0 kubenswrapper[28758]: I0223 14:49:57.875954 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b8b575dff-c9z8b" event={"ID":"575c77e7-ca40-4cb6-9ef4-9938f44932ec","Type":"ContainerStarted","Data":"f75e7073f04b421a1a4f3e588ff0c33bead6b848e368abd1bf616bf902d5bc2f"} Feb 23 14:49:57.876539 master-0 kubenswrapper[28758]: I0223 14:49:57.875972 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b8b575dff-c9z8b" event={"ID":"575c77e7-ca40-4cb6-9ef4-9938f44932ec","Type":"ContainerStarted","Data":"7ed47431ce0bfc9fe26b1cc2906518ed271a68c2b7670420ac45525531b30d11"} Feb 23 14:49:57.876539 master-0 kubenswrapper[28758]: I0223 14:49:57.876128 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:49:57.878628 master-0 
kubenswrapper[28758]: I0223 14:49:57.878581 28758 generic.go:334] "Generic (PLEG): container finished" podID="cc21cc06-410c-4afe-85d4-a72d8cebf881" containerID="f1899198a88f763740bd890a3b93d1aa145dbe8e2027b332ad777ba096752dec" exitCode=0 Feb 23 14:49:57.878723 master-0 kubenswrapper[28758]: I0223 14:49:57.878690 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-szljh" event={"ID":"cc21cc06-410c-4afe-85d4-a72d8cebf881","Type":"ContainerDied","Data":"f1899198a88f763740bd890a3b93d1aa145dbe8e2027b332ad777ba096752dec"} Feb 23 14:49:57.899509 master-0 kubenswrapper[28758]: I0223 14:49:57.897741 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b8b575dff-c9z8b" podStartSLOduration=1.897715853 podStartE2EDuration="1.897715853s" podCreationTimestamp="2026-02-23 14:49:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:49:57.895747231 +0000 UTC m=+930.022063163" watchObservedRunningTime="2026-02-23 14:49:57.897715853 +0000 UTC m=+930.024031805" Feb 23 14:49:58.339587 master-0 kubenswrapper[28758]: I0223 14:49:58.339538 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:49:58.413653 master-0 kubenswrapper[28758]: I0223 14:49:58.413574 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:49:58.707280 master-0 kubenswrapper[28758]: I0223 14:49:58.706872 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:49:58.776207 master-0 kubenswrapper[28758]: I0223 14:49:58.776147 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:49:58.850886 master-0 kubenswrapper[28758]: I0223 14:49:58.850652 28758 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/cinder-2d990-backup-0" Feb 23 14:49:58.911594 master-0 kubenswrapper[28758]: I0223 14:49:58.904105 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-2d990-scheduler-0" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="cinder-scheduler" containerID="cri-o://e7b0433721951dfb37923e3f3397313fe35bb598c89aa4a7ffc288127028a2ad" gracePeriod=30 Feb 23 14:49:58.911594 master-0 kubenswrapper[28758]: I0223 14:49:58.904292 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-2d990-scheduler-0" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="probe" containerID="cri-o://d10703ef54d01e9d2bd5964168e28f6bb29d723aece463c73eb4d8418eb7106d" gracePeriod=30 Feb 23 14:49:58.921640 master-0 kubenswrapper[28758]: I0223 14:49:58.921559 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="cinder-volume" containerID="cri-o://3bdcc4b7933f394730d3f24b022b4af86aac0763903d06917b66ab3c00fdc562" gracePeriod=30 Feb 23 14:49:58.921878 master-0 kubenswrapper[28758]: I0223 14:49:58.921642 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="probe" containerID="cri-o://9d782ea0bcfed09e73794591745ae7dd51f3de5161f1e96d6cb094ade09f86aa" gracePeriod=30 Feb 23 14:49:58.922457 master-0 kubenswrapper[28758]: I0223 14:49:58.922399 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:49:58.922935 master-0 kubenswrapper[28758]: I0223 14:49:58.922870 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-2d990-backup-0" podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="cinder-backup" 
containerID="cri-o://86b06d12ffc94bcbdc6b4b284ec6fa03e058c14f82ea14b05587f96271ce4a86" gracePeriod=30 Feb 23 14:49:58.923248 master-0 kubenswrapper[28758]: I0223 14:49:58.923163 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-2d990-backup-0" podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="probe" containerID="cri-o://08cf06726322e72e4cb4550309df3da0cc5dbd24384fbcb6e109a66128208f17" gracePeriod=30 Feb 23 14:49:59.293072 master-0 kubenswrapper[28758]: E0223 14:49:59.293011 28758 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42d8d5ca_eecd_4a43_9a68_1099bab111aa.slice/crio-3bdcc4b7933f394730d3f24b022b4af86aac0763903d06917b66ab3c00fdc562.scope\": RecentStats: unable to find data in memory cache]" Feb 23 14:49:59.485847 master-0 kubenswrapper[28758]: I0223 14:49:59.485160 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:59.616017 master-0 kubenswrapper[28758]: I0223 14:49:59.615777 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc21cc06-410c-4afe-85d4-a72d8cebf881-etc-podinfo\") pod \"cc21cc06-410c-4afe-85d4-a72d8cebf881\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " Feb 23 14:49:59.616017 master-0 kubenswrapper[28758]: I0223 14:49:59.615865 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-combined-ca-bundle\") pod \"cc21cc06-410c-4afe-85d4-a72d8cebf881\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " Feb 23 14:49:59.616017 master-0 kubenswrapper[28758]: I0223 14:49:59.615904 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxp6\" (UniqueName: \"kubernetes.io/projected/cc21cc06-410c-4afe-85d4-a72d8cebf881-kube-api-access-pbxp6\") pod \"cc21cc06-410c-4afe-85d4-a72d8cebf881\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " Feb 23 14:49:59.616360 master-0 kubenswrapper[28758]: I0223 14:49:59.616036 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data-merged\") pod \"cc21cc06-410c-4afe-85d4-a72d8cebf881\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " Feb 23 14:49:59.616360 master-0 kubenswrapper[28758]: I0223 14:49:59.616150 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data\") pod \"cc21cc06-410c-4afe-85d4-a72d8cebf881\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " Feb 23 14:49:59.616360 master-0 kubenswrapper[28758]: I0223 14:49:59.616228 28758 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-scripts\") pod \"cc21cc06-410c-4afe-85d4-a72d8cebf881\" (UID: \"cc21cc06-410c-4afe-85d4-a72d8cebf881\") " Feb 23 14:49:59.616929 master-0 kubenswrapper[28758]: I0223 14:49:59.616870 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "cc21cc06-410c-4afe-85d4-a72d8cebf881" (UID: "cc21cc06-410c-4afe-85d4-a72d8cebf881"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:49:59.621361 master-0 kubenswrapper[28758]: I0223 14:49:59.621008 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-scripts" (OuterVolumeSpecName: "scripts") pod "cc21cc06-410c-4afe-85d4-a72d8cebf881" (UID: "cc21cc06-410c-4afe-85d4-a72d8cebf881"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:59.628745 master-0 kubenswrapper[28758]: I0223 14:49:59.625638 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cc21cc06-410c-4afe-85d4-a72d8cebf881-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "cc21cc06-410c-4afe-85d4-a72d8cebf881" (UID: "cc21cc06-410c-4afe-85d4-a72d8cebf881"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 14:49:59.635501 master-0 kubenswrapper[28758]: I0223 14:49:59.633672 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc21cc06-410c-4afe-85d4-a72d8cebf881-kube-api-access-pbxp6" (OuterVolumeSpecName: "kube-api-access-pbxp6") pod "cc21cc06-410c-4afe-85d4-a72d8cebf881" (UID: "cc21cc06-410c-4afe-85d4-a72d8cebf881"). 
InnerVolumeSpecName "kube-api-access-pbxp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:49:59.678043 master-0 kubenswrapper[28758]: I0223 14:49:59.673824 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data" (OuterVolumeSpecName: "config-data") pod "cc21cc06-410c-4afe-85d4-a72d8cebf881" (UID: "cc21cc06-410c-4afe-85d4-a72d8cebf881"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:59.694503 master-0 kubenswrapper[28758]: I0223 14:49:59.694114 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc21cc06-410c-4afe-85d4-a72d8cebf881" (UID: "cc21cc06-410c-4afe-85d4-a72d8cebf881"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:49:59.719513 master-0 kubenswrapper[28758]: I0223 14:49:59.718898 28758 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cc21cc06-410c-4afe-85d4-a72d8cebf881-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:59.719513 master-0 kubenswrapper[28758]: I0223 14:49:59.718949 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:59.719513 master-0 kubenswrapper[28758]: I0223 14:49:59.718962 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxp6\" (UniqueName: \"kubernetes.io/projected/cc21cc06-410c-4afe-85d4-a72d8cebf881-kube-api-access-pbxp6\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:59.719513 master-0 kubenswrapper[28758]: I0223 14:49:59.718972 28758 reconciler_common.go:293] "Volume 
detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data-merged\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:59.719513 master-0 kubenswrapper[28758]: I0223 14:49:59.718982 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:59.719513 master-0 kubenswrapper[28758]: I0223 14:49:59.719021 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc21cc06-410c-4afe-85d4-a72d8cebf881-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:49:59.938468 master-0 kubenswrapper[28758]: I0223 14:49:59.938395 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-szljh" event={"ID":"cc21cc06-410c-4afe-85d4-a72d8cebf881","Type":"ContainerDied","Data":"0ef7982cd5cc6497dcf8db96b7e01a6d38aa2c80836d3b65a3fa8ce65d99959e"} Feb 23 14:49:59.938468 master-0 kubenswrapper[28758]: I0223 14:49:59.938456 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ef7982cd5cc6497dcf8db96b7e01a6d38aa2c80836d3b65a3fa8ce65d99959e" Feb 23 14:49:59.938468 master-0 kubenswrapper[28758]: I0223 14:49:59.938421 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-szljh" Feb 23 14:49:59.941688 master-0 kubenswrapper[28758]: I0223 14:49:59.940923 28758 generic.go:334] "Generic (PLEG): container finished" podID="7a793659-5013-4d7e-a986-ea95782880c2" containerID="08cf06726322e72e4cb4550309df3da0cc5dbd24384fbcb6e109a66128208f17" exitCode=0 Feb 23 14:49:59.941688 master-0 kubenswrapper[28758]: I0223 14:49:59.940998 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"7a793659-5013-4d7e-a986-ea95782880c2","Type":"ContainerDied","Data":"08cf06726322e72e4cb4550309df3da0cc5dbd24384fbcb6e109a66128208f17"} Feb 23 14:49:59.943146 master-0 kubenswrapper[28758]: I0223 14:49:59.943009 28758 generic.go:334] "Generic (PLEG): container finished" podID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerID="9d782ea0bcfed09e73794591745ae7dd51f3de5161f1e96d6cb094ade09f86aa" exitCode=0 Feb 23 14:49:59.943287 master-0 kubenswrapper[28758]: I0223 14:49:59.943147 28758 generic.go:334] "Generic (PLEG): container finished" podID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerID="3bdcc4b7933f394730d3f24b022b4af86aac0763903d06917b66ab3c00fdc562" exitCode=0 Feb 23 14:49:59.943287 master-0 kubenswrapper[28758]: I0223 14:49:59.943113 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"42d8d5ca-eecd-4a43-9a68-1099bab111aa","Type":"ContainerDied","Data":"9d782ea0bcfed09e73794591745ae7dd51f3de5161f1e96d6cb094ade09f86aa"} Feb 23 14:49:59.943287 master-0 kubenswrapper[28758]: I0223 14:49:59.943184 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"42d8d5ca-eecd-4a43-9a68-1099bab111aa","Type":"ContainerDied","Data":"3bdcc4b7933f394730d3f24b022b4af86aac0763903d06917b66ab3c00fdc562"} Feb 23 14:50:00.410527 master-0 kubenswrapper[28758]: I0223 14:50:00.398839 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.430340 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-w9qz6"] Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: E0223 14:50:00.431109 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="probe" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.431132 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="probe" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: E0223 14:50:00.431192 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc21cc06-410c-4afe-85d4-a72d8cebf881" containerName="init" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.431198 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc21cc06-410c-4afe-85d4-a72d8cebf881" containerName="init" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: E0223 14:50:00.431208 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="cinder-volume" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.431215 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="cinder-volume" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: E0223 14:50:00.431235 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc21cc06-410c-4afe-85d4-a72d8cebf881" containerName="ironic-db-sync" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.431240 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc21cc06-410c-4afe-85d4-a72d8cebf881" containerName="ironic-db-sync" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.431436 28758 
memory_manager.go:354] "RemoveStaleState removing state" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="cinder-volume" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.431469 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc21cc06-410c-4afe-85d4-a72d8cebf881" containerName="ironic-db-sync" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.431514 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" containerName="probe" Feb 23 14:50:00.433622 master-0 kubenswrapper[28758]: I0223 14:50:00.432363 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:00.454501 master-0 kubenswrapper[28758]: I0223 14:50:00.452083 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-w9qz6"] Feb 23 14:50:00.507566 master-0 kubenswrapper[28758]: I0223 14:50:00.488859 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-769d7f49c5-jj8xx"] Feb 23 14:50:00.507566 master-0 kubenswrapper[28758]: I0223 14:50:00.491322 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:00.507566 master-0 kubenswrapper[28758]: I0223 14:50:00.501639 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-769d7f49c5-jj8xx"] Feb 23 14:50:00.536598 master-0 kubenswrapper[28758]: I0223 14:50:00.521234 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Feb 23 14:50:00.536598 master-0 kubenswrapper[28758]: I0223 14:50:00.536546 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-149f-account-create-update-sktxn"] Feb 23 14:50:00.538213 master-0 kubenswrapper[28758]: I0223 14:50:00.538173 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:00.540042 master-0 kubenswrapper[28758]: I0223 14:50:00.539988 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Feb 23 14:50:00.542622 master-0 kubenswrapper[28758]: I0223 14:50:00.542580 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-brick\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.542622 master-0 kubenswrapper[28758]: I0223 14:50:00.542623 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-scripts\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.542800 master-0 kubenswrapper[28758]: I0223 14:50:00.542657 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-dev\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.544556 master-0 kubenswrapper[28758]: I0223 14:50:00.544504 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.544911 master-0 kubenswrapper[28758]: I0223 14:50:00.544810 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-dev" (OuterVolumeSpecName: "dev") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.547575 master-0 kubenswrapper[28758]: I0223 14:50:00.547527 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-iscsi\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.547710 master-0 kubenswrapper[28758]: I0223 14:50:00.547612 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szxf8\" (UniqueName: \"kubernetes.io/projected/42d8d5ca-eecd-4a43-9a68-1099bab111aa-kube-api-access-szxf8\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.547710 master-0 kubenswrapper[28758]: I0223 14:50:00.547644 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.547710 master-0 kubenswrapper[28758]: I0223 14:50:00.547683 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-sys\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.547873 master-0 kubenswrapper[28758]: I0223 14:50:00.547714 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-lib-modules\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.547873 master-0 kubenswrapper[28758]: I0223 14:50:00.547782 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-machine-id\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.547873 master-0 kubenswrapper[28758]: I0223 14:50:00.547858 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-cinder\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.548118 master-0 kubenswrapper[28758]: I0223 14:50:00.547886 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-combined-ca-bundle\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 
14:50:00.548118 master-0 kubenswrapper[28758]: I0223 14:50:00.547909 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-nvme\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.548118 master-0 kubenswrapper[28758]: I0223 14:50:00.547946 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data-custom\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.548118 master-0 kubenswrapper[28758]: I0223 14:50:00.547977 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-run\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.548118 master-0 kubenswrapper[28758]: I0223 14:50:00.547993 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-lib-cinder\") pod \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\" (UID: \"42d8d5ca-eecd-4a43-9a68-1099bab111aa\") " Feb 23 14:50:00.548389 master-0 kubenswrapper[28758]: I0223 14:50:00.548256 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a47ca88-30b0-4569-bcd5-00994d3facc0-operator-scripts\") pod \"ironic-inspector-db-create-w9qz6\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") " pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:00.548450 master-0 kubenswrapper[28758]: I0223 14:50:00.548434 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsndn\" (UniqueName: \"kubernetes.io/projected/6a47ca88-30b0-4569-bcd5-00994d3facc0-kube-api-access-wsndn\") pod \"ironic-inspector-db-create-w9qz6\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") " pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:00.548521 master-0 kubenswrapper[28758]: I0223 14:50:00.548434 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-sys" (OuterVolumeSpecName: "sys") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.548571 master-0 kubenswrapper[28758]: I0223 14:50:00.548524 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.548753 master-0 kubenswrapper[28758]: I0223 14:50:00.548709 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.548834 master-0 kubenswrapper[28758]: I0223 14:50:00.548773 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.549079 master-0 kubenswrapper[28758]: I0223 14:50:00.549047 28758 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.549079 master-0 kubenswrapper[28758]: I0223 14:50:00.549075 28758 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-dev\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.549616 master-0 kubenswrapper[28758]: I0223 14:50:00.549089 28758 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.549616 master-0 kubenswrapper[28758]: I0223 14:50:00.549101 28758 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-sys\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.549616 master-0 kubenswrapper[28758]: I0223 14:50:00.549112 28758 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-lib-modules\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.549616 master-0 kubenswrapper[28758]: I0223 14:50:00.549125 28758 reconciler_common.go:293] "Volume detached 
for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.549616 master-0 kubenswrapper[28758]: I0223 14:50:00.549550 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-run" (OuterVolumeSpecName: "run") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.549616 master-0 kubenswrapper[28758]: I0223 14:50:00.549595 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.550045 master-0 kubenswrapper[28758]: I0223 14:50:00.550014 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.550120 master-0 kubenswrapper[28758]: I0223 14:50:00.550061 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:00.583617 master-0 kubenswrapper[28758]: I0223 14:50:00.581747 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:00.583617 master-0 kubenswrapper[28758]: I0223 14:50:00.581986 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-scripts" (OuterVolumeSpecName: "scripts") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:00.583617 master-0 kubenswrapper[28758]: I0223 14:50:00.582146 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d8d5ca-eecd-4a43-9a68-1099bab111aa-kube-api-access-szxf8" (OuterVolumeSpecName: "kube-api-access-szxf8") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "kube-api-access-szxf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:00.615085 master-0 kubenswrapper[28758]: I0223 14:50:00.607907 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:00.650793 master-0 kubenswrapper[28758]: I0223 14:50:00.650674 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-config\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:00.650937 master-0 kubenswrapper[28758]: I0223 14:50:00.650894 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a47ca88-30b0-4569-bcd5-00994d3facc0-operator-scripts\") pod \"ironic-inspector-db-create-w9qz6\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") " pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:00.650991 master-0 kubenswrapper[28758]: I0223 14:50:00.650936 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-combined-ca-bundle\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:00.651165 master-0 kubenswrapper[28758]: I0223 14:50:00.651120 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z9v6\" (UniqueName: \"kubernetes.io/projected/251ce51f-9613-4b91-987b-bb29a897430f-kube-api-access-2z9v6\") pod \"ironic-inspector-149f-account-create-update-sktxn\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") " pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:00.651351 master-0 kubenswrapper[28758]: I0223 14:50:00.651200 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsndn\" 
(UniqueName: \"kubernetes.io/projected/6a47ca88-30b0-4569-bcd5-00994d3facc0-kube-api-access-wsndn\") pod \"ironic-inspector-db-create-w9qz6\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") " pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:00.651351 master-0 kubenswrapper[28758]: I0223 14:50:00.651260 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251ce51f-9613-4b91-987b-bb29a897430f-operator-scripts\") pod \"ironic-inspector-149f-account-create-update-sktxn\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") " pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:00.651351 master-0 kubenswrapper[28758]: I0223 14:50:00.651304 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j455\" (UniqueName: \"kubernetes.io/projected/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-kube-api-access-5j455\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651375 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szxf8\" (UniqueName: \"kubernetes.io/projected/42d8d5ca-eecd-4a43-9a68-1099bab111aa-kube-api-access-szxf8\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651392 28758 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651401 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651411 28758 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-etc-nvme\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651434 28758 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data-custom\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651443 28758 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-run\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651452 28758 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/42d8d5ca-eecd-4a43-9a68-1099bab111aa-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651510 master-0 kubenswrapper[28758]: I0223 14:50:00.651460 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:00.651790 master-0 kubenswrapper[28758]: I0223 14:50:00.651675 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a47ca88-30b0-4569-bcd5-00994d3facc0-operator-scripts\") pod \"ironic-inspector-db-create-w9qz6\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") " pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.753659 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-combined-ca-bundle\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.753749 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z9v6\" (UniqueName: \"kubernetes.io/projected/251ce51f-9613-4b91-987b-bb29a897430f-kube-api-access-2z9v6\") pod \"ironic-inspector-149f-account-create-update-sktxn\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") " pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.753861 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251ce51f-9613-4b91-987b-bb29a897430f-operator-scripts\") pod \"ironic-inspector-149f-account-create-update-sktxn\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") " pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.753891 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j455\" (UniqueName: \"kubernetes.io/projected/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-kube-api-access-5j455\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.753924 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-config\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: 
\"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.756567 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251ce51f-9613-4b91-987b-bb29a897430f-operator-scripts\") pod \"ironic-inspector-149f-account-create-update-sktxn\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") " pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.765116 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-combined-ca-bundle\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.765534 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-config\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.954873 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"42d8d5ca-eecd-4a43-9a68-1099bab111aa","Type":"ContainerDied","Data":"6fa8b148b095cdbd794cb63df596cd695896caadbdfe78ebcd369f966d6f37eb"} Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.954935 28758 scope.go:117] "RemoveContainer" containerID="9d782ea0bcfed09e73794591745ae7dd51f3de5161f1e96d6cb094ade09f86aa" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.955120 28758 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.959569 28758 generic.go:334] "Generic (PLEG): container finished" podID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerID="d10703ef54d01e9d2bd5964168e28f6bb29d723aece463c73eb4d8418eb7106d" exitCode=0 Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.959599 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"4021d59b-dfa8-49cc-b55c-48469a02b971","Type":"ContainerDied","Data":"d10703ef54d01e9d2bd5964168e28f6bb29d723aece463c73eb4d8418eb7106d"} Feb 23 14:50:01.207490 master-0 kubenswrapper[28758]: I0223 14:50:00.978562 28758 scope.go:117] "RemoveContainer" containerID="3bdcc4b7933f394730d3f24b022b4af86aac0763903d06917b66ab3c00fdc562" Feb 23 14:50:01.294260 master-0 kubenswrapper[28758]: I0223 14:50:01.288840 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-149f-account-create-update-sktxn"] Feb 23 14:50:01.329603 master-0 kubenswrapper[28758]: I0223 14:50:01.329554 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsndn\" (UniqueName: \"kubernetes.io/projected/6a47ca88-30b0-4569-bcd5-00994d3facc0-kube-api-access-wsndn\") pod \"ironic-inspector-db-create-w9qz6\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") " pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:01.330358 master-0 kubenswrapper[28758]: I0223 14:50:01.330336 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z9v6\" (UniqueName: \"kubernetes.io/projected/251ce51f-9613-4b91-987b-bb29a897430f-kube-api-access-2z9v6\") pod \"ironic-inspector-149f-account-create-update-sktxn\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") " pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:01.337393 master-0 kubenswrapper[28758]: I0223 
14:50:01.333933 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j455\" (UniqueName: \"kubernetes.io/projected/8c0c40ba-2093-4e1a-8166-bbb4c53f3a08-kube-api-access-5j455\") pod \"ironic-neutron-agent-769d7f49c5-jj8xx\" (UID: \"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08\") " pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:01.387830 master-0 kubenswrapper[28758]: I0223 14:50:01.387693 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-w9qz6" Feb 23 14:50:01.427499 master-0 kubenswrapper[28758]: I0223 14:50:01.413408 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"] Feb 23 14:50:01.427499 master-0 kubenswrapper[28758]: I0223 14:50:01.415098 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" podUID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerName="dnsmasq-dns" containerID="cri-o://16c08e5f7966221a02fc02f63d75f242d6fccae0624607fa8aab15528d4dcee7" gracePeriod=10 Feb 23 14:50:01.429157 master-0 kubenswrapper[28758]: I0223 14:50:01.428560 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:50:01.439349 master-0 kubenswrapper[28758]: I0223 14:50:01.439281 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data" (OuterVolumeSpecName: "config-data") pod "42d8d5ca-eecd-4a43-9a68-1099bab111aa" (UID: "42d8d5ca-eecd-4a43-9a68-1099bab111aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:01.490027 master-0 kubenswrapper[28758]: I0223 14:50:01.489977 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-677b4847c-74h4p"] Feb 23 14:50:01.495686 master-0 kubenswrapper[28758]: I0223 14:50:01.495631 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.511041 master-0 kubenswrapper[28758]: I0223 14:50:01.510977 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42d8d5ca-eecd-4a43-9a68-1099bab111aa-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:01.511598 master-0 kubenswrapper[28758]: I0223 14:50:01.511052 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-677b4847c-74h4p"] Feb 23 14:50:01.571038 master-0 kubenswrapper[28758]: I0223 14:50:01.570317 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:01.571038 master-0 kubenswrapper[28758]: I0223 14:50:01.570996 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-149f-account-create-update-sktxn" Feb 23 14:50:01.642394 master-0 kubenswrapper[28758]: I0223 14:50:01.642328 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7zw8\" (UniqueName: \"kubernetes.io/projected/07aaf639-2ccb-4ffb-be79-572b8990a03a-kube-api-access-t7zw8\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.642681 master-0 kubenswrapper[28758]: I0223 14:50:01.642472 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-swift-storage-0\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.642681 master-0 kubenswrapper[28758]: I0223 14:50:01.642529 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-config\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.642681 master-0 kubenswrapper[28758]: I0223 14:50:01.642596 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-sb\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.642889 master-0 kubenswrapper[28758]: I0223 14:50:01.642858 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-nb\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.643373 master-0 kubenswrapper[28758]: I0223 14:50:01.643243 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-svc\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.657443 master-0 kubenswrapper[28758]: I0223 14:50:01.657380 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-7dbb4799c9-gg6zd"] Feb 23 14:50:01.660807 master-0 kubenswrapper[28758]: I0223 14:50:01.660758 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.666023 master-0 kubenswrapper[28758]: I0223 14:50:01.665975 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Feb 23 14:50:01.666327 master-0 kubenswrapper[28758]: I0223 14:50:01.666300 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Feb 23 14:50:01.666618 master-0 kubenswrapper[28758]: I0223 14:50:01.666592 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Feb 23 14:50:01.666825 master-0 kubenswrapper[28758]: I0223 14:50:01.666793 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 23 14:50:01.666969 master-0 kubenswrapper[28758]: I0223 14:50:01.666947 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Feb 23 14:50:01.711360 master-0 kubenswrapper[28758]: I0223 14:50:01.711214 28758 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ironic-7dbb4799c9-gg6zd"] Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.744791 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7zw8\" (UniqueName: \"kubernetes.io/projected/07aaf639-2ccb-4ffb-be79-572b8990a03a-kube-api-access-t7zw8\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.744844 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddd7c1f3-7e33-4a32-a229-09521aa553e2-etc-podinfo\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.744914 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-logs\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.744977 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-swift-storage-0\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745030 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-config\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: 
\"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745055 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-custom\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745078 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-sb\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745120 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-combined-ca-bundle\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745139 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-nb\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745164 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-merged\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745202 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-svc\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745241 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-scripts\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745302 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn2gq\" (UniqueName: \"kubernetes.io/projected/ddd7c1f3-7e33-4a32-a229-09521aa553e2-kube-api-access-sn2gq\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.750280 master-0 kubenswrapper[28758]: I0223 14:50:01.745366 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.754435 master-0 kubenswrapper[28758]: I0223 14:50:01.754370 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-swift-storage-0\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.760416 master-0 kubenswrapper[28758]: I0223 14:50:01.760189 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-config\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.766588 master-0 kubenswrapper[28758]: I0223 14:50:01.766197 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-nb\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.766962 master-0 kubenswrapper[28758]: I0223 14:50:01.766916 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-svc\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.772414 master-0 kubenswrapper[28758]: I0223 14:50:01.769052 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-sb\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.778588 master-0 kubenswrapper[28758]: I0223 14:50:01.778249 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7zw8\" (UniqueName: 
\"kubernetes.io/projected/07aaf639-2ccb-4ffb-be79-572b8990a03a-kube-api-access-t7zw8\") pod \"dnsmasq-dns-677b4847c-74h4p\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:01.780024 master-0 kubenswrapper[28758]: I0223 14:50:01.779932 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:50:01.805579 master-0 kubenswrapper[28758]: I0223 14:50:01.805508 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:50:01.816119 master-0 kubenswrapper[28758]: I0223 14:50:01.816059 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:50:01.820939 master-0 kubenswrapper[28758]: I0223 14:50:01.818311 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.827972 master-0 kubenswrapper[28758]: I0223 14:50:01.827446 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-volume-lvm-iscsi-config-data" Feb 23 14:50:01.837307 master-0 kubenswrapper[28758]: I0223 14:50:01.837195 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.851122 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-run\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.851358 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-custom\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.851853 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-combined-ca-bundle\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.851943 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-merged\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.851987 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-lib-modules\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852034 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-combined-ca-bundle\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852082 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-locks-brick\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852115 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-scripts\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852134 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-dev\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852184 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nfhc\" (UniqueName: \"kubernetes.io/projected/dc9b5902-d242-43ef-a8cc-a6b9f256507a-kube-api-access-2nfhc\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852222 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-scripts\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 
kubenswrapper[28758]: I0223 14:50:01.852262 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn2gq\" (UniqueName: \"kubernetes.io/projected/ddd7c1f3-7e33-4a32-a229-09521aa553e2-kube-api-access-sn2gq\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852292 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-machine-id\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852389 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-iscsi\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852434 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852491 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-sys\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " 
pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852561 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-lib-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852596 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-config-data-custom\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852642 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddd7c1f3-7e33-4a32-a229-09521aa553e2-etc-podinfo\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852705 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-locks-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.852946 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-nvme\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.853034 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-logs\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.854042 master-0 kubenswrapper[28758]: I0223 14:50:01.853061 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-config-data\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.857764 master-0 kubenswrapper[28758]: I0223 14:50:01.857690 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-merged\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.861376 master-0 kubenswrapper[28758]: I0223 14:50:01.861327 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-logs\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.865901 master-0 kubenswrapper[28758]: I0223 14:50:01.865808 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-combined-ca-bundle\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.897569 master-0 kubenswrapper[28758]: I0223 14:50:01.897516 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn2gq\" (UniqueName: \"kubernetes.io/projected/ddd7c1f3-7e33-4a32-a229-09521aa553e2-kube-api-access-sn2gq\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.900043 master-0 kubenswrapper[28758]: I0223 14:50:01.900000 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.903014 master-0 kubenswrapper[28758]: I0223 14:50:01.902966 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-scripts\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.904357 master-0 kubenswrapper[28758]: I0223 14:50:01.904313 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-custom\") pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.905932 master-0 kubenswrapper[28758]: I0223 14:50:01.905889 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddd7c1f3-7e33-4a32-a229-09521aa553e2-etc-podinfo\") 
pod \"ironic-7dbb4799c9-gg6zd\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:01.956969 master-0 kubenswrapper[28758]: I0223 14:50:01.956890 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-locks-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.956969 master-0 kubenswrapper[28758]: I0223 14:50:01.956969 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-nvme\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957018 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-config-data\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957041 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-run\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957077 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-locks-cinder\") pod 
\"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957124 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-lib-modules\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957142 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-combined-ca-bundle\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957170 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-locks-brick\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957193 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-scripts\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957211 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-dev\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957237 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nfhc\" (UniqueName: \"kubernetes.io/projected/dc9b5902-d242-43ef-a8cc-a6b9f256507a-kube-api-access-2nfhc\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957268 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-machine-id\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957314 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-iscsi\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957339 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-sys\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957372 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-lib-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957390 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-config-data-custom\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957881 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-lib-cinder\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.958009 master-0 kubenswrapper[28758]: I0223 14:50:01.957934 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-iscsi\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.959020 master-0 kubenswrapper[28758]: I0223 14:50:01.958438 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-machine-id\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.959020 master-0 kubenswrapper[28758]: I0223 14:50:01.958525 28758 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-etc-nvme\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.959020 master-0 kubenswrapper[28758]: I0223 14:50:01.958559 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-sys\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.959141 master-0 kubenswrapper[28758]: I0223 14:50:01.959099 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-run\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.959191 master-0 kubenswrapper[28758]: I0223 14:50:01.959162 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-lib-modules\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.959232 master-0 kubenswrapper[28758]: I0223 14:50:01.959214 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-var-locks-brick\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.959372 master-0 kubenswrapper[28758]: I0223 14:50:01.959284 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/dc9b5902-d242-43ef-a8cc-a6b9f256507a-dev\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.970844 master-0 kubenswrapper[28758]: I0223 14:50:01.969738 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-config-data\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.973937 master-0 kubenswrapper[28758]: I0223 14:50:01.973245 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-scripts\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.977203 master-0 kubenswrapper[28758]: I0223 14:50:01.977152 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-config-data-custom\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.980531 master-0 kubenswrapper[28758]: I0223 14:50:01.980382 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9b5902-d242-43ef-a8cc-a6b9f256507a-combined-ca-bundle\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.994051 master-0 kubenswrapper[28758]: I0223 14:50:01.993954 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nfhc\" (UniqueName: 
\"kubernetes.io/projected/dc9b5902-d242-43ef-a8cc-a6b9f256507a-kube-api-access-2nfhc\") pod \"cinder-2d990-volume-lvm-iscsi-0\" (UID: \"dc9b5902-d242-43ef-a8cc-a6b9f256507a\") " pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:01.998842 master-0 kubenswrapper[28758]: I0223 14:50:01.998750 28758 generic.go:334] "Generic (PLEG): container finished" podID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerID="e7b0433721951dfb37923e3f3397313fe35bb598c89aa4a7ffc288127028a2ad" exitCode=0 Feb 23 14:50:01.998842 master-0 kubenswrapper[28758]: I0223 14:50:01.998829 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"4021d59b-dfa8-49cc-b55c-48469a02b971","Type":"ContainerDied","Data":"e7b0433721951dfb37923e3f3397313fe35bb598c89aa4a7ffc288127028a2ad"} Feb 23 14:50:02.009651 master-0 kubenswrapper[28758]: I0223 14:50:02.009530 28758 generic.go:334] "Generic (PLEG): container finished" podID="7a793659-5013-4d7e-a986-ea95782880c2" containerID="86b06d12ffc94bcbdc6b4b284ec6fa03e058c14f82ea14b05587f96271ce4a86" exitCode=0 Feb 23 14:50:02.009821 master-0 kubenswrapper[28758]: I0223 14:50:02.009694 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"7a793659-5013-4d7e-a986-ea95782880c2","Type":"ContainerDied","Data":"86b06d12ffc94bcbdc6b4b284ec6fa03e058c14f82ea14b05587f96271ce4a86"} Feb 23 14:50:02.012954 master-0 kubenswrapper[28758]: I0223 14:50:02.012910 28758 generic.go:334] "Generic (PLEG): container finished" podID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerID="16c08e5f7966221a02fc02f63d75f242d6fccae0624607fa8aab15528d4dcee7" exitCode=0 Feb 23 14:50:02.013028 master-0 kubenswrapper[28758]: I0223 14:50:02.012979 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" 
event={"ID":"e4ab41cd-61e0-4117-9d0f-108f29ae692c","Type":"ContainerDied","Data":"16c08e5f7966221a02fc02f63d75f242d6fccae0624607fa8aab15528d4dcee7"} Feb 23 14:50:02.030618 master-0 kubenswrapper[28758]: I0223 14:50:02.030498 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" Feb 23 14:50:02.042306 master-0 kubenswrapper[28758]: I0223 14:50:02.042247 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:02.075401 master-0 kubenswrapper[28758]: I0223 14:50:02.075328 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:02.113678 master-0 kubenswrapper[28758]: I0223 14:50:02.113588 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42d8d5ca-eecd-4a43-9a68-1099bab111aa" path="/var/lib/kubelet/pods/42d8d5ca-eecd-4a43-9a68-1099bab111aa/volumes" Feb 23 14:50:02.997717 master-0 kubenswrapper[28758]: I0223 14:50:02.997505 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:50:03.052498 master-0 kubenswrapper[28758]: I0223 14:50:03.047913 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:03.079836 master-0 kubenswrapper[28758]: I0223 14:50:03.075501 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" event={"ID":"e4ab41cd-61e0-4117-9d0f-108f29ae692c","Type":"ContainerDied","Data":"36aaac46fbc3ad1bbc0c1dbbd2d00f3be0000f33be79d3351597577ca0d1a0f1"} Feb 23 14:50:03.079836 master-0 kubenswrapper[28758]: I0223 14:50:03.075568 28758 scope.go:117] "RemoveContainer" containerID="16c08e5f7966221a02fc02f63d75f242d6fccae0624607fa8aab15528d4dcee7" Feb 23 14:50:03.079836 master-0 kubenswrapper[28758]: I0223 14:50:03.075771 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b69b7cf8f-pcb5c" Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136184 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-nb\") pod \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136261 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data-custom\") pod \"4021d59b-dfa8-49cc-b55c-48469a02b971\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136303 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-config\") pod \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136395 28758 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-scripts\") pod \"4021d59b-dfa8-49cc-b55c-48469a02b971\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136439 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-combined-ca-bundle\") pod \"4021d59b-dfa8-49cc-b55c-48469a02b971\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136496 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-swift-storage-0\") pod \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136711 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-sb\") pod \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.136789 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hskm6\" (UniqueName: \"kubernetes.io/projected/e4ab41cd-61e0-4117-9d0f-108f29ae692c-kube-api-access-hskm6\") pod \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.137035 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4021d59b-dfa8-49cc-b55c-48469a02b971-etc-machine-id\") pod \"4021d59b-dfa8-49cc-b55c-48469a02b971\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.137062 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-svc\") pod \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\" (UID: \"e4ab41cd-61e0-4117-9d0f-108f29ae692c\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.137085 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl5fr\" (UniqueName: \"kubernetes.io/projected/4021d59b-dfa8-49cc-b55c-48469a02b971-kube-api-access-fl5fr\") pod \"4021d59b-dfa8-49cc-b55c-48469a02b971\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " Feb 23 14:50:03.138499 master-0 kubenswrapper[28758]: I0223 14:50:03.137143 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data\") pod \"4021d59b-dfa8-49cc-b55c-48469a02b971\" (UID: \"4021d59b-dfa8-49cc-b55c-48469a02b971\") " Feb 23 14:50:03.152507 master-0 kubenswrapper[28758]: I0223 14:50:03.146665 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"4021d59b-dfa8-49cc-b55c-48469a02b971","Type":"ContainerDied","Data":"af68541bc58e379745c979863813dd1df390c09cb5e65da31042416700954961"} Feb 23 14:50:03.152507 master-0 kubenswrapper[28758]: I0223 14:50:03.146673 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4021d59b-dfa8-49cc-b55c-48469a02b971-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4021d59b-dfa8-49cc-b55c-48469a02b971" (UID: "4021d59b-dfa8-49cc-b55c-48469a02b971"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.152507 master-0 kubenswrapper[28758]: I0223 14:50:03.146796 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:03.246578 master-0 kubenswrapper[28758]: I0223 14:50:03.207837 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ab41cd-61e0-4117-9d0f-108f29ae692c-kube-api-access-hskm6" (OuterVolumeSpecName: "kube-api-access-hskm6") pod "e4ab41cd-61e0-4117-9d0f-108f29ae692c" (UID: "e4ab41cd-61e0-4117-9d0f-108f29ae692c"). InnerVolumeSpecName "kube-api-access-hskm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:03.246578 master-0 kubenswrapper[28758]: I0223 14:50:03.225772 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-w9qz6"] Feb 23 14:50:03.246578 master-0 kubenswrapper[28758]: I0223 14:50:03.241529 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hskm6\" (UniqueName: \"kubernetes.io/projected/e4ab41cd-61e0-4117-9d0f-108f29ae692c-kube-api-access-hskm6\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.246578 master-0 kubenswrapper[28758]: I0223 14:50:03.241566 28758 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4021d59b-dfa8-49cc-b55c-48469a02b971-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.307280 master-0 kubenswrapper[28758]: I0223 14:50:03.251671 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-scripts" (OuterVolumeSpecName: "scripts") pod "4021d59b-dfa8-49cc-b55c-48469a02b971" (UID: "4021d59b-dfa8-49cc-b55c-48469a02b971"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:03.307280 master-0 kubenswrapper[28758]: I0223 14:50:03.269732 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4021d59b-dfa8-49cc-b55c-48469a02b971" (UID: "4021d59b-dfa8-49cc-b55c-48469a02b971"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:03.307280 master-0 kubenswrapper[28758]: W0223 14:50:03.288194 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a47ca88_30b0_4569_bcd5_00994d3facc0.slice/crio-11e5f909571d3f87a65ba922309000163e4f78e7116d4a68d9332679e81c8be7 WatchSource:0}: Error finding container 11e5f909571d3f87a65ba922309000163e4f78e7116d4a68d9332679e81c8be7: Status 404 returned error can't find the container with id 11e5f909571d3f87a65ba922309000163e4f78e7116d4a68d9332679e81c8be7 Feb 23 14:50:03.307280 master-0 kubenswrapper[28758]: I0223 14:50:03.306182 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4021d59b-dfa8-49cc-b55c-48469a02b971-kube-api-access-fl5fr" (OuterVolumeSpecName: "kube-api-access-fl5fr") pod "4021d59b-dfa8-49cc-b55c-48469a02b971" (UID: "4021d59b-dfa8-49cc-b55c-48469a02b971"). InnerVolumeSpecName "kube-api-access-fl5fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:03.347655 master-0 kubenswrapper[28758]: I0223 14:50:03.347337 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.347655 master-0 kubenswrapper[28758]: I0223 14:50:03.347381 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl5fr\" (UniqueName: \"kubernetes.io/projected/4021d59b-dfa8-49cc-b55c-48469a02b971-kube-api-access-fl5fr\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.347655 master-0 kubenswrapper[28758]: I0223 14:50:03.347391 28758 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data-custom\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.382871 master-0 kubenswrapper[28758]: I0223 14:50:03.373069 28758 scope.go:117] "RemoveContainer" containerID="aa0b709d714808d5efab61d938e3547055f70e8e14c6e2eafa2711f9e7448570" Feb 23 14:50:03.441335 master-0 kubenswrapper[28758]: I0223 14:50:03.441248 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Feb 23 14:50:03.442062 master-0 kubenswrapper[28758]: E0223 14:50:03.442024 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerName="init" Feb 23 14:50:03.442062 master-0 kubenswrapper[28758]: I0223 14:50:03.442056 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerName="init" Feb 23 14:50:03.442186 master-0 kubenswrapper[28758]: E0223 14:50:03.442076 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerName="dnsmasq-dns" Feb 23 14:50:03.442186 master-0 kubenswrapper[28758]: I0223 14:50:03.442084 28758 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerName="dnsmasq-dns" Feb 23 14:50:03.442186 master-0 kubenswrapper[28758]: E0223 14:50:03.442121 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="probe" Feb 23 14:50:03.442186 master-0 kubenswrapper[28758]: I0223 14:50:03.442130 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="probe" Feb 23 14:50:03.442331 master-0 kubenswrapper[28758]: E0223 14:50:03.442238 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="cinder-scheduler" Feb 23 14:50:03.442331 master-0 kubenswrapper[28758]: I0223 14:50:03.442251 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="cinder-scheduler" Feb 23 14:50:03.442937 master-0 kubenswrapper[28758]: I0223 14:50:03.442904 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" containerName="dnsmasq-dns" Feb 23 14:50:03.443009 master-0 kubenswrapper[28758]: I0223 14:50:03.442997 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="cinder-scheduler" Feb 23 14:50:03.443056 master-0 kubenswrapper[28758]: I0223 14:50:03.443011 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" containerName="probe" Feb 23 14:50:03.481784 master-0 kubenswrapper[28758]: I0223 14:50:03.479782 28758 scope.go:117] "RemoveContainer" containerID="d10703ef54d01e9d2bd5964168e28f6bb29d723aece463c73eb4d8418eb7106d" Feb 23 14:50:03.485491 master-0 kubenswrapper[28758]: I0223 14:50:03.482616 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "4021d59b-dfa8-49cc-b55c-48469a02b971" (UID: "4021d59b-dfa8-49cc-b55c-48469a02b971"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:03.500438 master-0 kubenswrapper[28758]: I0223 14:50:03.497265 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.506068 master-0 kubenswrapper[28758]: I0223 14:50:03.506002 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Feb 23 14:50:03.515063 master-0 kubenswrapper[28758]: I0223 14:50:03.514980 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Feb 23 14:50:03.516966 master-0 kubenswrapper[28758]: I0223 14:50:03.516828 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Feb 23 14:50:03.518842 master-0 kubenswrapper[28758]: I0223 14:50:03.518374 28758 scope.go:117] "RemoveContainer" containerID="e7b0433721951dfb37923e3f3397313fe35bb598c89aa4a7ffc288127028a2ad" Feb 23 14:50:03.535454 master-0 kubenswrapper[28758]: I0223 14:50:03.535408 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Feb 23 14:50:03.553434 master-0 kubenswrapper[28758]: I0223 14:50:03.552780 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:03.563320 master-0 kubenswrapper[28758]: I0223 14:50:03.562384 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e4ab41cd-61e0-4117-9d0f-108f29ae692c" (UID: "e4ab41cd-61e0-4117-9d0f-108f29ae692c"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:03.599222 master-0 kubenswrapper[28758]: I0223 14:50:03.599165 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.599502 master-0 kubenswrapper[28758]: I0223 14:50:03.599461 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.599680 master-0 kubenswrapper[28758]: I0223 14:50:03.599658 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a235a236-5566-49dd-8535-9eda75c13d3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c5924d5c-7571-44ad-ba50-1c9e0c5d8928\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.599810 master-0 kubenswrapper[28758]: I0223 14:50:03.599793 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-config-data\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.599950 master-0 kubenswrapper[28758]: I0223 14:50:03.599932 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmplf\" (UniqueName: 
\"kubernetes.io/projected/894fbd22-c889-426b-954b-04a9a0e4d905-kube-api-access-nmplf\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.600066 master-0 kubenswrapper[28758]: I0223 14:50:03.600050 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/894fbd22-c889-426b-954b-04a9a0e4d905-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.600229 master-0 kubenswrapper[28758]: I0223 14:50:03.600211 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-scripts\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.602295 master-0 kubenswrapper[28758]: I0223 14:50:03.600461 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/894fbd22-c889-426b-954b-04a9a0e4d905-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.602295 master-0 kubenswrapper[28758]: I0223 14:50:03.600805 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.638251 master-0 kubenswrapper[28758]: I0223 14:50:03.636740 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-config" (OuterVolumeSpecName: "config") pod "e4ab41cd-61e0-4117-9d0f-108f29ae692c" (UID: 
"e4ab41cd-61e0-4117-9d0f-108f29ae692c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:03.682439 master-0 kubenswrapper[28758]: I0223 14:50:03.681514 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e4ab41cd-61e0-4117-9d0f-108f29ae692c" (UID: "e4ab41cd-61e0-4117-9d0f-108f29ae692c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:03.690206 master-0 kubenswrapper[28758]: I0223 14:50:03.688247 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e4ab41cd-61e0-4117-9d0f-108f29ae692c" (UID: "e4ab41cd-61e0-4117-9d0f-108f29ae692c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:03.701313 master-0 kubenswrapper[28758]: I0223 14:50:03.701253 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e4ab41cd-61e0-4117-9d0f-108f29ae692c" (UID: "e4ab41cd-61e0-4117-9d0f-108f29ae692c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:03.701912 master-0 kubenswrapper[28758]: I0223 14:50:03.701794 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-brick\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.702006 master-0 kubenswrapper[28758]: I0223 14:50:03.701918 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-dev\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.702006 master-0 kubenswrapper[28758]: I0223 14:50:03.701959 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-iscsi\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.702176 master-0 kubenswrapper[28758]: I0223 14:50:03.702025 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-lib-cinder\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.702176 master-0 kubenswrapper[28758]: I0223 14:50:03.702068 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-combined-ca-bundle\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.702176 master-0 kubenswrapper[28758]: I0223 14:50:03.702127 28758 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-nvme\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.702176 master-0 kubenswrapper[28758]: I0223 14:50:03.702156 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-scripts\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702184 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-lib-modules\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702227 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-machine-id\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702247 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702264 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6zq4\" (UniqueName: \"kubernetes.io/projected/7a793659-5013-4d7e-a986-ea95782880c2-kube-api-access-w6zq4\") pod 
\"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702309 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-cinder\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702326 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-sys\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702344 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-run\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.702406 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data-custom\") pod \"7a793659-5013-4d7e-a986-ea95782880c2\" (UID: \"7a793659-5013-4d7e-a986-ea95782880c2\") " Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.705769 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.705854 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.705880 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-dev" (OuterVolumeSpecName: "dev") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.705976 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a235a236-5566-49dd-8535-9eda75c13d3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c5924d5c-7571-44ad-ba50-1c9e0c5d8928\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706117 master-0 kubenswrapper[28758]: I0223 14:50:03.706079 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-config-data\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706722 master-0 kubenswrapper[28758]: I0223 14:50:03.706176 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmplf\" (UniqueName: \"kubernetes.io/projected/894fbd22-c889-426b-954b-04a9a0e4d905-kube-api-access-nmplf\") pod 
\"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706722 master-0 kubenswrapper[28758]: I0223 14:50:03.706247 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/894fbd22-c889-426b-954b-04a9a0e4d905-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706722 master-0 kubenswrapper[28758]: I0223 14:50:03.706383 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-scripts\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706722 master-0 kubenswrapper[28758]: I0223 14:50:03.706521 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/894fbd22-c889-426b-954b-04a9a0e4d905-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706892 master-0 kubenswrapper[28758]: I0223 14:50:03.706770 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706892 master-0 kubenswrapper[28758]: I0223 14:50:03.706800 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " 
pod="openstack/ironic-conductor-0" Feb 23 14:50:03.706978 master-0 kubenswrapper[28758]: I0223 14:50:03.706949 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.706978 master-0 kubenswrapper[28758]: I0223 14:50:03.706973 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.707063 master-0 kubenswrapper[28758]: I0223 14:50:03.706989 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.707063 master-0 kubenswrapper[28758]: I0223 14:50:03.707001 28758 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.707063 master-0 kubenswrapper[28758]: I0223 14:50:03.707015 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4ab41cd-61e0-4117-9d0f-108f29ae692c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.707063 master-0 kubenswrapper[28758]: I0223 14:50:03.707027 28758 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-dev\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.707063 master-0 kubenswrapper[28758]: I0223 14:50:03.707038 28758 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Feb 
23 14:50:03.709223 master-0 kubenswrapper[28758]: I0223 14:50:03.707569 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.709223 master-0 kubenswrapper[28758]: I0223 14:50:03.707872 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.709223 master-0 kubenswrapper[28758]: I0223 14:50:03.708339 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/894fbd22-c889-426b-954b-04a9a0e4d905-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.709583 master-0 kubenswrapper[28758]: I0223 14:50:03.709457 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-sys" (OuterVolumeSpecName: "sys") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.709583 master-0 kubenswrapper[28758]: I0223 14:50:03.709548 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.709680 master-0 kubenswrapper[28758]: I0223 14:50:03.709590 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-run" (OuterVolumeSpecName: "run") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.710290 master-0 kubenswrapper[28758]: I0223 14:50:03.709929 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.710290 master-0 kubenswrapper[28758]: I0223 14:50:03.709976 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 14:50:03.715140 master-0 kubenswrapper[28758]: I0223 14:50:03.715101 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 23 14:50:03.715140 master-0 kubenswrapper[28758]: I0223 14:50:03.715151 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a235a236-5566-49dd-8535-9eda75c13d3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c5924d5c-7571-44ad-ba50-1c9e0c5d8928\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/3a6eaf19d6a1b3ff09505a03a36e036d8775fba8d510afd358e02337496f21a6/globalmount\"" pod="openstack/ironic-conductor-0" Feb 23 14:50:03.720943 master-0 kubenswrapper[28758]: I0223 14:50:03.720890 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-scripts\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.739812 master-0 kubenswrapper[28758]: I0223 14:50:03.739769 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.740726 master-0 kubenswrapper[28758]: I0223 14:50:03.740692 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.740970 master-0 kubenswrapper[28758]: I0223 
14:50:03.740939 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/894fbd22-c889-426b-954b-04a9a0e4d905-config-data\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.742953 master-0 kubenswrapper[28758]: I0223 14:50:03.742835 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/894fbd22-c889-426b-954b-04a9a0e4d905-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.745668 master-0 kubenswrapper[28758]: I0223 14:50:03.745593 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-scripts" (OuterVolumeSpecName: "scripts") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:03.747073 master-0 kubenswrapper[28758]: I0223 14:50:03.747023 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a793659-5013-4d7e-a986-ea95782880c2-kube-api-access-w6zq4" (OuterVolumeSpecName: "kube-api-access-w6zq4") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "kube-api-access-w6zq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:03.748722 master-0 kubenswrapper[28758]: I0223 14:50:03.748679 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmplf\" (UniqueName: \"kubernetes.io/projected/894fbd22-c889-426b-954b-04a9a0e4d905-kube-api-access-nmplf\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:03.773070 master-0 kubenswrapper[28758]: I0223 14:50:03.772929 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:03.788415 master-0 kubenswrapper[28758]: I0223 14:50:03.787376 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-volume-lvm-iscsi-0"] Feb 23 14:50:03.809382 master-0 kubenswrapper[28758]: I0223 14:50:03.809321 28758 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-nvme\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809434 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809461 28758 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-lib-modules\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809471 28758 
reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809564 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6zq4\" (UniqueName: \"kubernetes.io/projected/7a793659-5013-4d7e-a986-ea95782880c2-kube-api-access-w6zq4\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809575 28758 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809583 28758 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-sys\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809591 28758 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-run\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809601 28758 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data-custom\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.809695 master-0 kubenswrapper[28758]: I0223 14:50:03.809610 28758 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7a793659-5013-4d7e-a986-ea95782880c2-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:03.922429 master-0 kubenswrapper[28758]: I0223 14:50:03.922359 28758 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-769d7f49c5-jj8xx"] Feb 23 14:50:03.948927 master-0 kubenswrapper[28758]: I0223 14:50:03.948878 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-149f-account-create-update-sktxn"] Feb 23 14:50:04.002599 master-0 kubenswrapper[28758]: I0223 14:50:04.002540 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-677b4847c-74h4p"] Feb 23 14:50:04.014243 master-0 kubenswrapper[28758]: I0223 14:50:04.014177 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-7dbb4799c9-gg6zd"] Feb 23 14:50:04.087294 master-0 kubenswrapper[28758]: I0223 14:50:04.087221 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"] Feb 23 14:50:04.091644 master-0 kubenswrapper[28758]: I0223 14:50:04.091573 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:04.117601 master-0 kubenswrapper[28758]: I0223 14:50:04.117471 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:04.127237 master-0 kubenswrapper[28758]: I0223 14:50:04.127165 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data" (OuterVolumeSpecName: "config-data") pod "4021d59b-dfa8-49cc-b55c-48469a02b971" (UID: "4021d59b-dfa8-49cc-b55c-48469a02b971"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:04.173693 master-0 kubenswrapper[28758]: I0223 14:50:04.173452 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.230995 master-0 kubenswrapper[28758]: I0223 14:50:04.230866 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4021d59b-dfa8-49cc-b55c-48469a02b971-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:04.275778 master-0 kubenswrapper[28758]: I0223 14:50:04.275703 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data" (OuterVolumeSpecName: "config-data") pod "7a793659-5013-4d7e-a986-ea95782880c2" (UID: "7a793659-5013-4d7e-a986-ea95782880c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:04.333765 master-0 kubenswrapper[28758]: I0223 14:50:04.333707 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a793659-5013-4d7e-a986-ea95782880c2-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:04.357531 master-0 kubenswrapper[28758]: I0223 14:50:04.357466 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b69b7cf8f-pcb5c"] Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357596 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"dc9b5902-d242-43ef-a8cc-a6b9f256507a","Type":"ContainerStarted","Data":"eea3c04c5708b2f0bb562efe7080ca8428703b3e39bfbd09455df3ae21a9503f"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357618 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-149f-account-create-update-sktxn" 
event={"ID":"251ce51f-9613-4b91-987b-bb29a897430f","Type":"ContainerStarted","Data":"4dc56005a393d48dccae4d24c1ab1a22a3518b67e521426d9ad5d68a38cb8185"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357634 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dbb4799c9-gg6zd" event={"ID":"ddd7c1f3-7e33-4a32-a229-09521aa553e2","Type":"ContainerStarted","Data":"9ec8606c9605fc6bff44c7578f847f8da953577b255dff354684cc17bf229516"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357647 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b4847c-74h4p" event={"ID":"07aaf639-2ccb-4ffb-be79-572b8990a03a","Type":"ContainerStarted","Data":"49faa17d5b50e1775025769332eb5f1f97e9dc8fdab50376834e4752b3dcb907"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357658 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"7a793659-5013-4d7e-a986-ea95782880c2","Type":"ContainerDied","Data":"603a95c4ec2210a0fcdee0b0f3e894ce0c00cf99b448ab802058f2b047e1fa69"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357670 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-w9qz6" event={"ID":"6a47ca88-30b0-4569-bcd5-00994d3facc0","Type":"ContainerStarted","Data":"8b53e0ca29c760d86a78f00fa7f9b5319f8023c9a30f5cbcf5bcf9156478c801"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357680 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-w9qz6" event={"ID":"6a47ca88-30b0-4569-bcd5-00994d3facc0","Type":"ContainerStarted","Data":"11e5f909571d3f87a65ba922309000163e4f78e7116d4a68d9332679e81c8be7"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357689 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" 
event={"ID":"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08","Type":"ContainerStarted","Data":"641f8a2ecd9058ea59ec8817814660b2de8bebb62fee5613ad0161db3853f87c"} Feb 23 14:50:04.357848 master-0 kubenswrapper[28758]: I0223 14:50:04.357744 28758 scope.go:117] "RemoveContainer" containerID="08cf06726322e72e4cb4550309df3da0cc5dbd24384fbcb6e109a66128208f17" Feb 23 14:50:04.469059 master-0 kubenswrapper[28758]: I0223 14:50:04.468858 28758 scope.go:117] "RemoveContainer" containerID="86b06d12ffc94bcbdc6b4b284ec6fa03e058c14f82ea14b05587f96271ce4a86" Feb 23 14:50:04.515446 master-0 kubenswrapper[28758]: I0223 14:50:04.515360 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-create-w9qz6" podStartSLOduration=4.515336347 podStartE2EDuration="4.515336347s" podCreationTimestamp="2026-02-23 14:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:04.219500377 +0000 UTC m=+936.345816329" watchObservedRunningTime="2026-02-23 14:50:04.515336347 +0000 UTC m=+936.641652289" Feb 23 14:50:04.518640 master-0 kubenswrapper[28758]: I0223 14:50:04.518574 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:50:04.532396 master-0 kubenswrapper[28758]: I0223 14:50:04.532158 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:50:04.551281 master-0 kubenswrapper[28758]: I0223 14:50:04.551150 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:50:04.551967 master-0 kubenswrapper[28758]: E0223 14:50:04.551937 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="cinder-backup" Feb 23 14:50:04.551967 master-0 kubenswrapper[28758]: I0223 14:50:04.551962 28758 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="cinder-backup" Feb 23 14:50:04.552051 master-0 kubenswrapper[28758]: E0223 14:50:04.551982 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="probe" Feb 23 14:50:04.552051 master-0 kubenswrapper[28758]: I0223 14:50:04.551989 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="probe" Feb 23 14:50:04.552219 master-0 kubenswrapper[28758]: I0223 14:50:04.552196 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="cinder-backup" Feb 23 14:50:04.552252 master-0 kubenswrapper[28758]: I0223 14:50:04.552230 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a793659-5013-4d7e-a986-ea95782880c2" containerName="probe" Feb 23 14:50:04.554368 master-0 kubenswrapper[28758]: I0223 14:50:04.554300 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.557280 master-0 kubenswrapper[28758]: I0223 14:50:04.556919 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-backup-config-data" Feb 23 14:50:04.568575 master-0 kubenswrapper[28758]: I0223 14:50:04.568510 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650443 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-locks-brick\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650548 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-locks-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650592 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-lib-modules\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650628 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-sys\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " 
pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650663 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-run\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650685 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgpt9\" (UniqueName: \"kubernetes.io/projected/3465aa00-c58d-4c78-8d97-4f2543f9265d-kube-api-access-lgpt9\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650706 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-nvme\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650741 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-scripts\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650775 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-machine-id\") pod \"cinder-2d990-backup-0\" (UID: 
\"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650792 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-config-data-custom\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650823 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-iscsi\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650841 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-combined-ca-bundle\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650870 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-dev\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650890 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-config-data\") pod 
\"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.651214 master-0 kubenswrapper[28758]: I0223 14:50:04.650912 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-lib-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.754903 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-nvme\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755014 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-scripts\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755076 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-machine-id\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755123 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-config-data-custom\") pod \"cinder-2d990-backup-0\" (UID: 
\"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755184 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-iscsi\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755215 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-combined-ca-bundle\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755264 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-dev\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755302 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-config-data\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755341 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-lib-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 
14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755387 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-locks-brick\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755490 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-locks-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.755557 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-lib-modules\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.756085 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-machine-id\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.756454 master-0 kubenswrapper[28758]: I0223 14:50:04.756161 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-nvme\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757088 master-0 kubenswrapper[28758]: I0223 14:50:04.756631 28758 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-sys\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757088 master-0 kubenswrapper[28758]: I0223 14:50:04.756790 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-run\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757088 master-0 kubenswrapper[28758]: I0223 14:50:04.756839 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgpt9\" (UniqueName: \"kubernetes.io/projected/3465aa00-c58d-4c78-8d97-4f2543f9265d-kube-api-access-lgpt9\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757414 master-0 kubenswrapper[28758]: I0223 14:50:04.757384 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-locks-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757460 master-0 kubenswrapper[28758]: I0223 14:50:04.757422 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-sys\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757518 master-0 kubenswrapper[28758]: I0223 14:50:04.757490 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-locks-brick\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757563 master-0 kubenswrapper[28758]: I0223 14:50:04.757502 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-var-lib-cinder\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757563 master-0 kubenswrapper[28758]: I0223 14:50:04.757521 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-etc-iscsi\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757563 master-0 kubenswrapper[28758]: I0223 14:50:04.757549 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-run\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.757677 master-0 kubenswrapper[28758]: I0223 14:50:04.757660 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-dev\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.762636 master-0 kubenswrapper[28758]: I0223 14:50:04.759413 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3465aa00-c58d-4c78-8d97-4f2543f9265d-lib-modules\") pod \"cinder-2d990-backup-0\" (UID: 
\"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.762636 master-0 kubenswrapper[28758]: I0223 14:50:04.762528 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-scripts\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.765982 master-0 kubenswrapper[28758]: I0223 14:50:04.765566 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-55d877d745-zlz8n"] Feb 23 14:50:04.771580 master-0 kubenswrapper[28758]: I0223 14:50:04.767269 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-config-data-custom\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.771580 master-0 kubenswrapper[28758]: I0223 14:50:04.768105 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.772584 master-0 kubenswrapper[28758]: I0223 14:50:04.771930 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Feb 23 14:50:04.778489 master-0 kubenswrapper[28758]: I0223 14:50:04.773223 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Feb 23 14:50:04.789461 master-0 kubenswrapper[28758]: I0223 14:50:04.789395 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-config-data\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.804881 master-0 kubenswrapper[28758]: I0223 14:50:04.804650 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-55d877d745-zlz8n"] Feb 23 14:50:04.851780 master-0 kubenswrapper[28758]: I0223 14:50:04.848340 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3465aa00-c58d-4c78-8d97-4f2543f9265d-combined-ca-bundle\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.852169 master-0 kubenswrapper[28758]: I0223 14:50:04.852102 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgpt9\" (UniqueName: \"kubernetes.io/projected/3465aa00-c58d-4c78-8d97-4f2543f9265d-kube-api-access-lgpt9\") pod \"cinder-2d990-backup-0\" (UID: \"3465aa00-c58d-4c78-8d97-4f2543f9265d\") " pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.859655 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data-merged\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.859857 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x7nz\" (UniqueName: \"kubernetes.io/projected/ec9d99fd-acd0-4435-bc55-034519a4b417-kube-api-access-7x7nz\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.859997 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-scripts\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.860074 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9d99fd-acd0-4435-bc55-034519a4b417-logs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.860164 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-public-tls-certs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.860234 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data-custom\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.860297 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-combined-ca-bundle\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.860341 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-internal-tls-certs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.860423 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.865009 master-0 kubenswrapper[28758]: I0223 14:50:04.860528 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ec9d99fd-acd0-4435-bc55-034519a4b417-etc-podinfo\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.877081 master-0 kubenswrapper[28758]: I0223 14:50:04.876582 28758 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:04.921497 master-0 kubenswrapper[28758]: I0223 14:50:04.916503 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:50:04.921497 master-0 kubenswrapper[28758]: I0223 14:50:04.918679 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.962455 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.962543 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ec9d99fd-acd0-4435-bc55-034519a4b417-etc-podinfo\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.962607 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data-merged\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.962653 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x7nz\" (UniqueName: \"kubernetes.io/projected/ec9d99fd-acd0-4435-bc55-034519a4b417-kube-api-access-7x7nz\") pod \"ironic-55d877d745-zlz8n\" (UID: 
\"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.962677 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-scripts\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.962699 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9d99fd-acd0-4435-bc55-034519a4b417-logs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.962729 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-public-tls-certs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.964015 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec9d99fd-acd0-4435-bc55-034519a4b417-logs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.965682 master-0 kubenswrapper[28758]: I0223 14:50:04.965313 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data-merged\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 
14:50:04.978463 master-0 kubenswrapper[28758]: I0223 14:50:04.967675 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data-custom\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.978463 master-0 kubenswrapper[28758]: I0223 14:50:04.967778 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-combined-ca-bundle\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.978463 master-0 kubenswrapper[28758]: I0223 14:50:04.967831 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-internal-tls-certs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.978463 master-0 kubenswrapper[28758]: I0223 14:50:04.970674 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-combined-ca-bundle\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.978463 master-0 kubenswrapper[28758]: I0223 14:50:04.971663 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.978463 master-0 
kubenswrapper[28758]: I0223 14:50:04.976981 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-internal-tls-certs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.987383 master-0 kubenswrapper[28758]: I0223 14:50:04.986927 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-public-tls-certs\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.987577 master-0 kubenswrapper[28758]: I0223 14:50:04.987418 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ec9d99fd-acd0-4435-bc55-034519a4b417-etc-podinfo\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.989455 master-0 kubenswrapper[28758]: I0223 14:50:04.988399 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-scripts\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:04.989455 master-0 kubenswrapper[28758]: I0223 14:50:04.988907 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec9d99fd-acd0-4435-bc55-034519a4b417-config-data-custom\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:05.042452 master-0 kubenswrapper[28758]: I0223 14:50:05.042337 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7x7nz\" (UniqueName: \"kubernetes.io/projected/ec9d99fd-acd0-4435-bc55-034519a4b417-kube-api-access-7x7nz\") pod \"ironic-55d877d745-zlz8n\" (UID: \"ec9d99fd-acd0-4435-bc55-034519a4b417\") " pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:05.213116 master-0 kubenswrapper[28758]: I0223 14:50:05.211487 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:05.220340 master-0 kubenswrapper[28758]: I0223 14:50:05.220273 28758 generic.go:334] "Generic (PLEG): container finished" podID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerID="daf88e757071588e1c1e500c56b0244ca9ad17ba37196c8aeb3f21f3fb19885b" exitCode=0 Feb 23 14:50:05.220586 master-0 kubenswrapper[28758]: I0223 14:50:05.220363 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b4847c-74h4p" event={"ID":"07aaf639-2ccb-4ffb-be79-572b8990a03a","Type":"ContainerDied","Data":"daf88e757071588e1c1e500c56b0244ca9ad17ba37196c8aeb3f21f3fb19885b"} Feb 23 14:50:05.226720 master-0 kubenswrapper[28758]: I0223 14:50:05.226619 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"dc9b5902-d242-43ef-a8cc-a6b9f256507a","Type":"ContainerStarted","Data":"0bdb10aa559f7e37810d963099c65ac844decf3a99f65342518651a81459b65c"} Feb 23 14:50:05.230776 master-0 kubenswrapper[28758]: I0223 14:50:05.230198 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-149f-account-create-update-sktxn" event={"ID":"251ce51f-9613-4b91-987b-bb29a897430f","Type":"ContainerStarted","Data":"0839226bf47baff2ee4462e5c3235acac5b23f2c1f08a5b5f9b45464eed9fd45"} Feb 23 14:50:05.241763 master-0 kubenswrapper[28758]: I0223 14:50:05.241549 28758 generic.go:334] "Generic (PLEG): container finished" podID="6a47ca88-30b0-4569-bcd5-00994d3facc0" 
containerID="8b53e0ca29c760d86a78f00fa7f9b5319f8023c9a30f5cbcf5bcf9156478c801" exitCode=0 Feb 23 14:50:05.241826 master-0 kubenswrapper[28758]: I0223 14:50:05.241742 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-w9qz6" event={"ID":"6a47ca88-30b0-4569-bcd5-00994d3facc0","Type":"ContainerDied","Data":"8b53e0ca29c760d86a78f00fa7f9b5319f8023c9a30f5cbcf5bcf9156478c801"} Feb 23 14:50:05.435669 master-0 kubenswrapper[28758]: I0223 14:50:05.435551 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-149f-account-create-update-sktxn" podStartSLOduration=5.435527935 podStartE2EDuration="5.435527935s" podCreationTimestamp="2026-02-23 14:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:05.429885045 +0000 UTC m=+937.556200987" watchObservedRunningTime="2026-02-23 14:50:05.435527935 +0000 UTC m=+937.561843877" Feb 23 14:50:05.554578 master-0 kubenswrapper[28758]: I0223 14:50:05.552713 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5fb8596466-gd59d"] Feb 23 14:50:05.569978 master-0 kubenswrapper[28758]: I0223 14:50:05.569765 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.601083 master-0 kubenswrapper[28758]: I0223 14:50:05.598450 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a235a236-5566-49dd-8535-9eda75c13d3e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c5924d5c-7571-44ad-ba50-1c9e0c5d8928\") pod \"ironic-conductor-0\" (UID: \"894fbd22-c889-426b-954b-04a9a0e4d905\") " pod="openstack/ironic-conductor-0" Feb 23 14:50:05.601083 master-0 kubenswrapper[28758]: I0223 14:50:05.599425 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fb8596466-gd59d"] Feb 23 14:50:05.636195 master-0 kubenswrapper[28758]: I0223 14:50:05.636016 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Feb 23 14:50:05.662027 master-0 kubenswrapper[28758]: I0223 14:50:05.661992 28758 trace.go:236] Trace[1664769204]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (23-Feb-2026 14:50:04.242) (total time: 1419ms): Feb 23 14:50:05.662027 master-0 kubenswrapper[28758]: Trace[1664769204]: [1.419909635s] [1.419909635s] END Feb 23 14:50:05.696011 master-0 kubenswrapper[28758]: I0223 14:50:05.695952 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-internal-tls-certs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.696137 master-0 kubenswrapper[28758]: I0223 14:50:05.696047 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-public-tls-certs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " 
pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.696137 master-0 kubenswrapper[28758]: I0223 14:50:05.696110 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-config-data\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.696213 master-0 kubenswrapper[28758]: I0223 14:50:05.696190 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-combined-ca-bundle\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.696213 master-0 kubenswrapper[28758]: I0223 14:50:05.696208 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/041b044e-bc1a-4693-8c06-6260d5fe663e-logs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.696289 master-0 kubenswrapper[28758]: I0223 14:50:05.696234 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrvw7\" (UniqueName: \"kubernetes.io/projected/041b044e-bc1a-4693-8c06-6260d5fe663e-kube-api-access-nrvw7\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.696289 master-0 kubenswrapper[28758]: I0223 14:50:05.696256 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-scripts\") pod 
\"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.698423 master-0 kubenswrapper[28758]: I0223 14:50:05.697540 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-backup-0"] Feb 23 14:50:05.700629 master-0 kubenswrapper[28758]: W0223 14:50:05.700001 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3465aa00_c58d_4c78_8d97_4f2543f9265d.slice/crio-0da3d0ca9b86f002ff04a00b34e4e642c0f63a349277b514a98573e7dbec9ede WatchSource:0}: Error finding container 0da3d0ca9b86f002ff04a00b34e4e642c0f63a349277b514a98573e7dbec9ede: Status 404 returned error can't find the container with id 0da3d0ca9b86f002ff04a00b34e4e642c0f63a349277b514a98573e7dbec9ede Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.797948 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-public-tls-certs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.798136 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-config-data\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.798267 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-combined-ca-bundle\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") 
" pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.798525 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/041b044e-bc1a-4693-8c06-6260d5fe663e-logs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.798715 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrvw7\" (UniqueName: \"kubernetes.io/projected/041b044e-bc1a-4693-8c06-6260d5fe663e-kube-api-access-nrvw7\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.800222 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-scripts\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.800549 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-internal-tls-certs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.800770 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/041b044e-bc1a-4693-8c06-6260d5fe663e-logs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 
23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.805592 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-config-data\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.808097 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-combined-ca-bundle\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.808460 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-public-tls-certs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.811249 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-internal-tls-certs\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.814514 master-0 kubenswrapper[28758]: I0223 14:50:05.813395 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041b044e-bc1a-4693-8c06-6260d5fe663e-scripts\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:05.832547 master-0 kubenswrapper[28758]: I0223 
14:50:05.831341 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrvw7\" (UniqueName: \"kubernetes.io/projected/041b044e-bc1a-4693-8c06-6260d5fe663e-kube-api-access-nrvw7\") pod \"placement-5fb8596466-gd59d\" (UID: \"041b044e-bc1a-4693-8c06-6260d5fe663e\") " pod="openstack/placement-5fb8596466-gd59d"
Feb 23 14:50:05.939512 master-0 kubenswrapper[28758]: I0223 14:50:05.937369 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-55d877d745-zlz8n"]
Feb 23 14:50:05.959103 master-0 kubenswrapper[28758]: W0223 14:50:05.957987 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec9d99fd_acd0_4435_bc55_034519a4b417.slice/crio-5f94ed0d13591303d2941b2802bb17fbbe0d924baff3362ccbe5b3cb8ecc2e3e WatchSource:0}: Error finding container 5f94ed0d13591303d2941b2802bb17fbbe0d924baff3362ccbe5b3cb8ecc2e3e: Status 404 returned error can't find the container with id 5f94ed0d13591303d2941b2802bb17fbbe0d924baff3362ccbe5b3cb8ecc2e3e
Feb 23 14:50:06.005375 master-0 kubenswrapper[28758]: I0223 14:50:05.997521 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5fb8596466-gd59d"
Feb 23 14:50:06.122453 master-0 kubenswrapper[28758]: I0223 14:50:06.122229 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a793659-5013-4d7e-a986-ea95782880c2" path="/var/lib/kubelet/pods/7a793659-5013-4d7e-a986-ea95782880c2/volumes"
Feb 23 14:50:06.123804 master-0 kubenswrapper[28758]: I0223 14:50:06.123774 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ab41cd-61e0-4117-9d0f-108f29ae692c" path="/var/lib/kubelet/pods/e4ab41cd-61e0-4117-9d0f-108f29ae692c/volumes"
Feb 23 14:50:06.213456 master-0 kubenswrapper[28758]: I0223 14:50:06.152728 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-2d990-api-0"
Feb 23 14:50:06.277459 master-0 kubenswrapper[28758]: I0223 14:50:06.277392 28758 generic.go:334] "Generic (PLEG): container finished" podID="251ce51f-9613-4b91-987b-bb29a897430f" containerID="0839226bf47baff2ee4462e5c3235acac5b23f2c1f08a5b5f9b45464eed9fd45" exitCode=0
Feb 23 14:50:06.277984 master-0 kubenswrapper[28758]: I0223 14:50:06.277457 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-149f-account-create-update-sktxn" event={"ID":"251ce51f-9613-4b91-987b-bb29a897430f","Type":"ContainerDied","Data":"0839226bf47baff2ee4462e5c3235acac5b23f2c1f08a5b5f9b45464eed9fd45"}
Feb 23 14:50:06.282423 master-0 kubenswrapper[28758]: I0223 14:50:06.280523 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-55d877d745-zlz8n" event={"ID":"ec9d99fd-acd0-4435-bc55-034519a4b417","Type":"ContainerStarted","Data":"5f94ed0d13591303d2941b2802bb17fbbe0d924baff3362ccbe5b3cb8ecc2e3e"}
Feb 23 14:50:06.286471 master-0 kubenswrapper[28758]: I0223 14:50:06.285662 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b4847c-74h4p" event={"ID":"07aaf639-2ccb-4ffb-be79-572b8990a03a","Type":"ContainerStarted","Data":"fa3cfa1fada3c7da4b5e8ec9e6a80a74a5a1d0556a728604612ebc94d61bf72e"}
Feb 23 14:50:06.286771 master-0 kubenswrapper[28758]: I0223 14:50:06.286707 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-677b4847c-74h4p"
Feb 23 14:50:06.290882 master-0 kubenswrapper[28758]: I0223 14:50:06.289068 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" event={"ID":"dc9b5902-d242-43ef-a8cc-a6b9f256507a","Type":"ContainerStarted","Data":"82f27205e4a4da246581aafa465f66d219a819abbe1c471eb3f7cf9a58143d5a"}
Feb 23 14:50:06.293541 master-0 kubenswrapper[28758]: I0223 14:50:06.293461 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"3465aa00-c58d-4c78-8d97-4f2543f9265d","Type":"ContainerStarted","Data":"892b8dfc1515ef485c692f4a6be1443c842b21f0a3fcb1597a6316eb951ec553"}
Feb 23 14:50:06.293541 master-0 kubenswrapper[28758]: I0223 14:50:06.293514 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"3465aa00-c58d-4c78-8d97-4f2543f9265d","Type":"ContainerStarted","Data":"0da3d0ca9b86f002ff04a00b34e4e642c0f63a349277b514a98573e7dbec9ede"}
Feb 23 14:50:06.371710 master-0 kubenswrapper[28758]: I0223 14:50:06.371247 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-volume-lvm-iscsi-0" podStartSLOduration=5.371215566 podStartE2EDuration="5.371215566s" podCreationTimestamp="2026-02-23 14:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:06.364374804 +0000 UTC m=+938.490690736" watchObservedRunningTime="2026-02-23 14:50:06.371215566 +0000 UTC m=+938.497531508"
Feb 23 14:50:06.410993 master-0 kubenswrapper[28758]: I0223 14:50:06.409673 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-677b4847c-74h4p" podStartSLOduration=5.409645007 podStartE2EDuration="5.409645007s" podCreationTimestamp="2026-02-23 14:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:06.402529348 +0000 UTC m=+938.528845280" watchObservedRunningTime="2026-02-23 14:50:06.409645007 +0000 UTC m=+938.535960939"
Feb 23 14:50:06.448078 master-0 kubenswrapper[28758]: I0223 14:50:06.448001 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"]
Feb 23 14:50:07.031385 master-0 kubenswrapper[28758]: I0223 14:50:07.031305 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-2d990-volume-lvm-iscsi-0"
Feb 23 14:50:07.122829 master-0 kubenswrapper[28758]: I0223 14:50:07.122767 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-w9qz6"
Feb 23 14:50:07.253501 master-0 kubenswrapper[28758]: I0223 14:50:07.252438 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a47ca88-30b0-4569-bcd5-00994d3facc0-operator-scripts\") pod \"6a47ca88-30b0-4569-bcd5-00994d3facc0\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") "
Feb 23 14:50:07.253501 master-0 kubenswrapper[28758]: I0223 14:50:07.253165 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsndn\" (UniqueName: \"kubernetes.io/projected/6a47ca88-30b0-4569-bcd5-00994d3facc0-kube-api-access-wsndn\") pod \"6a47ca88-30b0-4569-bcd5-00994d3facc0\" (UID: \"6a47ca88-30b0-4569-bcd5-00994d3facc0\") "
Feb 23 14:50:07.258500 master-0 kubenswrapper[28758]: I0223 14:50:07.252994 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a47ca88-30b0-4569-bcd5-00994d3facc0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a47ca88-30b0-4569-bcd5-00994d3facc0" (UID: "6a47ca88-30b0-4569-bcd5-00994d3facc0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:50:07.264498 master-0 kubenswrapper[28758]: I0223 14:50:07.263680 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a47ca88-30b0-4569-bcd5-00994d3facc0-kube-api-access-wsndn" (OuterVolumeSpecName: "kube-api-access-wsndn") pod "6a47ca88-30b0-4569-bcd5-00994d3facc0" (UID: "6a47ca88-30b0-4569-bcd5-00994d3facc0"). InnerVolumeSpecName "kube-api-access-wsndn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:50:07.315514 master-0 kubenswrapper[28758]: I0223 14:50:07.315014 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerStarted","Data":"7516998ba7eabe3f615be54cc8029a9c71642ce6813c5f2e0f273c6a9f5824aa"}
Feb 23 14:50:07.319505 master-0 kubenswrapper[28758]: I0223 14:50:07.316807 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-w9qz6" event={"ID":"6a47ca88-30b0-4569-bcd5-00994d3facc0","Type":"ContainerDied","Data":"11e5f909571d3f87a65ba922309000163e4f78e7116d4a68d9332679e81c8be7"}
Feb 23 14:50:07.319505 master-0 kubenswrapper[28758]: I0223 14:50:07.316841 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11e5f909571d3f87a65ba922309000163e4f78e7116d4a68d9332679e81c8be7"
Feb 23 14:50:07.319505 master-0 kubenswrapper[28758]: I0223 14:50:07.316880 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-w9qz6"
Feb 23 14:50:07.358742 master-0 kubenswrapper[28758]: I0223 14:50:07.357239 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsndn\" (UniqueName: \"kubernetes.io/projected/6a47ca88-30b0-4569-bcd5-00994d3facc0-kube-api-access-wsndn\") on node \"master-0\" DevicePath \"\""
Feb 23 14:50:07.358742 master-0 kubenswrapper[28758]: I0223 14:50:07.357291 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a47ca88-30b0-4569-bcd5-00994d3facc0-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:50:08.106970 master-0 kubenswrapper[28758]: I0223 14:50:08.106649 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-149f-account-create-update-sktxn"
Feb 23 14:50:08.196628 master-0 kubenswrapper[28758]: I0223 14:50:08.196230 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251ce51f-9613-4b91-987b-bb29a897430f-operator-scripts\") pod \"251ce51f-9613-4b91-987b-bb29a897430f\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") "
Feb 23 14:50:08.196628 master-0 kubenswrapper[28758]: I0223 14:50:08.196498 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z9v6\" (UniqueName: \"kubernetes.io/projected/251ce51f-9613-4b91-987b-bb29a897430f-kube-api-access-2z9v6\") pod \"251ce51f-9613-4b91-987b-bb29a897430f\" (UID: \"251ce51f-9613-4b91-987b-bb29a897430f\") "
Feb 23 14:50:08.197270 master-0 kubenswrapper[28758]: I0223 14:50:08.197066 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251ce51f-9613-4b91-987b-bb29a897430f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "251ce51f-9613-4b91-987b-bb29a897430f" (UID: "251ce51f-9613-4b91-987b-bb29a897430f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:50:08.200614 master-0 kubenswrapper[28758]: I0223 14:50:08.197825 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/251ce51f-9613-4b91-987b-bb29a897430f-operator-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:50:08.235505 master-0 kubenswrapper[28758]: I0223 14:50:08.229155 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251ce51f-9613-4b91-987b-bb29a897430f-kube-api-access-2z9v6" (OuterVolumeSpecName: "kube-api-access-2z9v6") pod "251ce51f-9613-4b91-987b-bb29a897430f" (UID: "251ce51f-9613-4b91-987b-bb29a897430f"). InnerVolumeSpecName "kube-api-access-2z9v6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:50:08.303501 master-0 kubenswrapper[28758]: I0223 14:50:08.301681 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z9v6\" (UniqueName: \"kubernetes.io/projected/251ce51f-9613-4b91-987b-bb29a897430f-kube-api-access-2z9v6\") on node \"master-0\" DevicePath \"\""
Feb 23 14:50:08.351498 master-0 kubenswrapper[28758]: I0223 14:50:08.348092 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-backup-0" event={"ID":"3465aa00-c58d-4c78-8d97-4f2543f9265d","Type":"ContainerStarted","Data":"f4ab4fa6c1d3be460b3f657de8069c11ec8dd7cf882539ac767fb2e3d7232c14"}
Feb 23 14:50:08.366002 master-0 kubenswrapper[28758]: I0223 14:50:08.353665 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-149f-account-create-update-sktxn" event={"ID":"251ce51f-9613-4b91-987b-bb29a897430f","Type":"ContainerDied","Data":"4dc56005a393d48dccae4d24c1ab1a22a3518b67e521426d9ad5d68a38cb8185"}
Feb 23 14:50:08.366002 master-0 kubenswrapper[28758]: I0223 14:50:08.353712 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dc56005a393d48dccae4d24c1ab1a22a3518b67e521426d9ad5d68a38cb8185"
Feb 23 14:50:08.366002 master-0 kubenswrapper[28758]: I0223 14:50:08.353766 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-149f-account-create-update-sktxn"
Feb 23 14:50:08.419674 master-0 kubenswrapper[28758]: I0223 14:50:08.419592 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fb8596466-gd59d"]
Feb 23 14:50:08.429421 master-0 kubenswrapper[28758]: I0223 14:50:08.429350 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-backup-0" podStartSLOduration=4.429329309 podStartE2EDuration="4.429329309s" podCreationTimestamp="2026-02-23 14:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:08.374092752 +0000 UTC m=+940.500408684" watchObservedRunningTime="2026-02-23 14:50:08.429329309 +0000 UTC m=+940.555645241"
Feb 23 14:50:09.386388 master-0 kubenswrapper[28758]: I0223 14:50:09.386184 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fb8596466-gd59d" event={"ID":"041b044e-bc1a-4693-8c06-6260d5fe663e","Type":"ContainerStarted","Data":"fa1b2d693629a1f10c76ee21b05e0831243c0138c0b32d39a42898eecc465a33"}
Feb 23 14:50:09.386388 master-0 kubenswrapper[28758]: I0223 14:50:09.386260 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fb8596466-gd59d" event={"ID":"041b044e-bc1a-4693-8c06-6260d5fe663e","Type":"ContainerStarted","Data":"2654d1c5eae7af147a185e796f74d7b3fb71e7ab36d5448a322518c3e8288c1d"}
Feb 23 14:50:09.386388 master-0 kubenswrapper[28758]: I0223 14:50:09.386273 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fb8596466-gd59d" event={"ID":"041b044e-bc1a-4693-8c06-6260d5fe663e","Type":"ContainerStarted","Data":"3d2f9d80d2f30936598c98c4325dac6d585930e66674af6e38d1b9f41961abf6"}
Feb 23 14:50:09.387004 master-0 kubenswrapper[28758]: I0223 14:50:09.386468 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fb8596466-gd59d"
Feb 23 14:50:09.387004 master-0 kubenswrapper[28758]: I0223 14:50:09.386591 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fb8596466-gd59d"
Feb 23 14:50:09.390104 master-0 kubenswrapper[28758]: I0223 14:50:09.390028 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" event={"ID":"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08","Type":"ContainerStarted","Data":"7ec34062e9ab758e1b03daf0b27364f5d5426c10fdbad94ead0aa03eef90fe6c"}
Feb 23 14:50:09.390661 master-0 kubenswrapper[28758]: I0223 14:50:09.390443 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx"
Feb 23 14:50:09.392702 master-0 kubenswrapper[28758]: I0223 14:50:09.391684 28758 generic.go:334] "Generic (PLEG): container finished" podID="ec9d99fd-acd0-4435-bc55-034519a4b417" containerID="7589954e4ef815812128b262508ba1797cd44ec3ff7162f50615063d823fc500" exitCode=0
Feb 23 14:50:09.392702 master-0 kubenswrapper[28758]: I0223 14:50:09.391733 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-55d877d745-zlz8n" event={"ID":"ec9d99fd-acd0-4435-bc55-034519a4b417","Type":"ContainerDied","Data":"7589954e4ef815812128b262508ba1797cd44ec3ff7162f50615063d823fc500"}
Feb 23 14:50:09.393866 master-0 kubenswrapper[28758]: I0223 14:50:09.393165 28758 generic.go:334] "Generic (PLEG): container finished" podID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerID="00469c31edf043d3bd6d22e3bf72fc59d69df0f246c782a7ae55c95933685939" exitCode=0
Feb 23 14:50:09.393866 master-0 kubenswrapper[28758]: I0223 14:50:09.393215 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dbb4799c9-gg6zd" event={"ID":"ddd7c1f3-7e33-4a32-a229-09521aa553e2","Type":"ContainerDied","Data":"00469c31edf043d3bd6d22e3bf72fc59d69df0f246c782a7ae55c95933685939"}
Feb 23 14:50:09.397774 master-0 kubenswrapper[28758]: I0223 14:50:09.397588 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerStarted","Data":"68b88b5ee88140ed1c25412c5788c79b6659e27cce563152d8f13ca0888be4f2"}
Feb 23 14:50:09.466041 master-0 kubenswrapper[28758]: I0223 14:50:09.465969 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5fb8596466-gd59d" podStartSLOduration=4.465948661 podStartE2EDuration="4.465948661s" podCreationTimestamp="2026-02-23 14:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:09.418183841 +0000 UTC m=+941.544499773" watchObservedRunningTime="2026-02-23 14:50:09.465948661 +0000 UTC m=+941.592264593"
Feb 23 14:50:09.544329 master-0 kubenswrapper[28758]: I0223 14:50:09.544235 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" podStartSLOduration=5.521317004 podStartE2EDuration="9.54421419s" podCreationTimestamp="2026-02-23 14:50:00 +0000 UTC" firstStartedPulling="2026-02-23 14:50:03.979604073 +0000 UTC m=+936.105920005" lastFinishedPulling="2026-02-23 14:50:08.002501239 +0000 UTC m=+940.128817191" observedRunningTime="2026-02-23 14:50:09.529264393 +0000 UTC m=+941.655580335" watchObservedRunningTime="2026-02-23 14:50:09.54421419 +0000 UTC m=+941.670530122"
Feb 23 14:50:09.878317 master-0 kubenswrapper[28758]: I0223 14:50:09.878237 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-2d990-backup-0"
Feb 23 14:50:10.408798 master-0 kubenswrapper[28758]: I0223 14:50:10.408713 28758 generic.go:334] "Generic (PLEG): container finished" podID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerID="b80e135e91550ae24e3a338fb4d8dc0a2e64ca6a51568536464caba6443ec08d" exitCode=1
Feb 23 14:50:10.408798 master-0 kubenswrapper[28758]: I0223 14:50:10.408784 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dbb4799c9-gg6zd" event={"ID":"ddd7c1f3-7e33-4a32-a229-09521aa553e2","Type":"ContainerDied","Data":"b80e135e91550ae24e3a338fb4d8dc0a2e64ca6a51568536464caba6443ec08d"}
Feb 23 14:50:10.408798 master-0 kubenswrapper[28758]: I0223 14:50:10.408814 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dbb4799c9-gg6zd" event={"ID":"ddd7c1f3-7e33-4a32-a229-09521aa553e2","Type":"ContainerStarted","Data":"146c10621d1b6c06a49e812424dab5aeb362a464f2a46c840f5ddc128e722f23"}
Feb 23 14:50:10.409635 master-0 kubenswrapper[28758]: I0223 14:50:10.409550 28758 scope.go:117] "RemoveContainer" containerID="b80e135e91550ae24e3a338fb4d8dc0a2e64ca6a51568536464caba6443ec08d"
Feb 23 14:50:10.417023 master-0 kubenswrapper[28758]: I0223 14:50:10.415625 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-55d877d745-zlz8n" event={"ID":"ec9d99fd-acd0-4435-bc55-034519a4b417","Type":"ContainerStarted","Data":"f73af67f289f29900c491caca45ff8b76865e6f5d8ef0222253816c8a51ae675"}
Feb 23 14:50:10.417023 master-0 kubenswrapper[28758]: I0223 14:50:10.416809 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-55d877d745-zlz8n" event={"ID":"ec9d99fd-acd0-4435-bc55-034519a4b417","Type":"ContainerStarted","Data":"7285c038cb83617be506d2492ec99564df16a05365e9fa59cd5fe1d5bb1659a8"}
Feb 23 14:50:10.419037 master-0 kubenswrapper[28758]: I0223 14:50:10.418097 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-55d877d745-zlz8n"
Feb 23 14:50:10.503956 master-0 kubenswrapper[28758]: I0223 14:50:10.503024 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-55d877d745-zlz8n" podStartSLOduration=4.45314789 podStartE2EDuration="6.503001404s" podCreationTimestamp="2026-02-23 14:50:04 +0000 UTC" firstStartedPulling="2026-02-23 14:50:05.960252977 +0000 UTC m=+938.086568909" lastFinishedPulling="2026-02-23 14:50:08.010106491 +0000 UTC m=+940.136422423" observedRunningTime="2026-02-23 14:50:10.488406757 +0000 UTC m=+942.614722699" watchObservedRunningTime="2026-02-23 14:50:10.503001404 +0000 UTC m=+942.629317346"
Feb 23 14:50:11.434529 master-0 kubenswrapper[28758]: I0223 14:50:11.433837 28758 generic.go:334] "Generic (PLEG): container finished" podID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerID="99765de22307a1c342adcf8794d27957519466d1118f780cf8792ba120d06e91" exitCode=1
Feb 23 14:50:11.434529 master-0 kubenswrapper[28758]: I0223 14:50:11.433975 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dbb4799c9-gg6zd" event={"ID":"ddd7c1f3-7e33-4a32-a229-09521aa553e2","Type":"ContainerDied","Data":"99765de22307a1c342adcf8794d27957519466d1118f780cf8792ba120d06e91"}
Feb 23 14:50:11.434529 master-0 kubenswrapper[28758]: I0223 14:50:11.434085 28758 scope.go:117] "RemoveContainer" containerID="b80e135e91550ae24e3a338fb4d8dc0a2e64ca6a51568536464caba6443ec08d"
Feb 23 14:50:11.437135 master-0 kubenswrapper[28758]: I0223 14:50:11.436717 28758 generic.go:334] "Generic (PLEG): container finished" podID="894fbd22-c889-426b-954b-04a9a0e4d905" containerID="68b88b5ee88140ed1c25412c5788c79b6659e27cce563152d8f13ca0888be4f2" exitCode=0
Feb 23 14:50:11.437135 master-0 kubenswrapper[28758]: I0223 14:50:11.436821 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerDied","Data":"68b88b5ee88140ed1c25412c5788c79b6659e27cce563152d8f13ca0888be4f2"}
Feb 23 14:50:11.440365 master-0 kubenswrapper[28758]: I0223 14:50:11.440321 28758 scope.go:117] "RemoveContainer" containerID="99765de22307a1c342adcf8794d27957519466d1118f780cf8792ba120d06e91"
Feb 23 14:50:11.440602 master-0 kubenswrapper[28758]: I0223 14:50:11.440572 28758 generic.go:334] "Generic (PLEG): container finished" podID="8c0c40ba-2093-4e1a-8166-bbb4c53f3a08" containerID="7ec34062e9ab758e1b03daf0b27364f5d5426c10fdbad94ead0aa03eef90fe6c" exitCode=1
Feb 23 14:50:11.440686 master-0 kubenswrapper[28758]: I0223 14:50:11.440627 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" event={"ID":"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08","Type":"ContainerDied","Data":"7ec34062e9ab758e1b03daf0b27364f5d5426c10fdbad94ead0aa03eef90fe6c"}
Feb 23 14:50:11.441027 master-0 kubenswrapper[28758]: E0223 14:50:11.440967 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7dbb4799c9-gg6zd_openstack(ddd7c1f3-7e33-4a32-a229-09521aa553e2)\"" pod="openstack/ironic-7dbb4799c9-gg6zd" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2"
Feb 23 14:50:11.441623 master-0 kubenswrapper[28758]: I0223 14:50:11.441571 28758 scope.go:117] "RemoveContainer" containerID="7ec34062e9ab758e1b03daf0b27364f5d5426c10fdbad94ead0aa03eef90fe6c"
Feb 23 14:50:11.571739 master-0 kubenswrapper[28758]: I0223 14:50:11.571659 28758 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx"
Feb 23 14:50:12.045251 master-0 kubenswrapper[28758]: I0223 14:50:12.045173 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-677b4847c-74h4p"
Feb 23 14:50:12.081715 master-0 kubenswrapper[28758]: I0223 14:50:12.081612 28758 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-7dbb4799c9-gg6zd"
Feb 23 14:50:12.081715 master-0 kubenswrapper[28758]: I0223 14:50:12.081671 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-7dbb4799c9-gg6zd"
Feb 23 14:50:12.138371 master-0 kubenswrapper[28758]: I0223 14:50:12.138297 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ddcf4757-slsxg"]
Feb 23 14:50:12.138732 master-0 kubenswrapper[28758]: I0223 14:50:12.138687 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" podUID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" containerName="dnsmasq-dns" containerID="cri-o://7cb67f8d72e3e54bce465a80b556566400c5d647ff8f81e442763ed5e78f5ff4" gracePeriod=10
Feb 23 14:50:12.408221 master-0 kubenswrapper[28758]: I0223 14:50:12.408158 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-2d990-volume-lvm-iscsi-0"
Feb 23 14:50:12.458416 master-0 kubenswrapper[28758]: I0223 14:50:12.457730 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" event={"ID":"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08","Type":"ContainerStarted","Data":"7a8e55d4703a52a75f17c69a848f147a36062c67b0d146662f6b138cb546db14"}
Feb 23 14:50:12.458416 master-0 kubenswrapper[28758]: I0223 14:50:12.457806 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx"
Feb 23 14:50:12.461174 master-0 kubenswrapper[28758]: I0223 14:50:12.461127 28758 generic.go:334] "Generic (PLEG): container finished" podID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" containerID="7cb67f8d72e3e54bce465a80b556566400c5d647ff8f81e442763ed5e78f5ff4" exitCode=0
Feb 23 14:50:12.461266 master-0 kubenswrapper[28758]: I0223 14:50:12.461203 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" event={"ID":"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef","Type":"ContainerDied","Data":"7cb67f8d72e3e54bce465a80b556566400c5d647ff8f81e442763ed5e78f5ff4"}
Feb 23 14:50:12.465116 master-0 kubenswrapper[28758]: I0223 14:50:12.464995 28758 scope.go:117] "RemoveContainer" containerID="99765de22307a1c342adcf8794d27957519466d1118f780cf8792ba120d06e91"
Feb 23 14:50:12.465448 master-0 kubenswrapper[28758]: E0223 14:50:12.465272 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7dbb4799c9-gg6zd_openstack(ddd7c1f3-7e33-4a32-a229-09521aa553e2)\"" pod="openstack/ironic-7dbb4799c9-gg6zd" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2"
Feb 23 14:50:12.562985 master-0 kubenswrapper[28758]: I0223 14:50:12.562769 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56fc45f8f5-fsvgg"
Feb 23 14:50:12.889998 master-0 kubenswrapper[28758]: I0223 14:50:12.889127 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 23 14:50:12.889998 master-0 kubenswrapper[28758]: E0223 14:50:12.889793 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="251ce51f-9613-4b91-987b-bb29a897430f" containerName="mariadb-account-create-update"
Feb 23 14:50:12.889998 master-0 kubenswrapper[28758]: I0223 14:50:12.889811 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="251ce51f-9613-4b91-987b-bb29a897430f" containerName="mariadb-account-create-update"
Feb 23 14:50:12.889998 master-0 kubenswrapper[28758]: E0223 14:50:12.889838 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a47ca88-30b0-4569-bcd5-00994d3facc0" containerName="mariadb-database-create"
Feb 23 14:50:12.889998 master-0 kubenswrapper[28758]: I0223 14:50:12.889845 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a47ca88-30b0-4569-bcd5-00994d3facc0" containerName="mariadb-database-create"
Feb 23 14:50:12.890345 master-0 kubenswrapper[28758]: I0223 14:50:12.890065 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="251ce51f-9613-4b91-987b-bb29a897430f" containerName="mariadb-account-create-update"
Feb 23 14:50:12.890345 master-0 kubenswrapper[28758]: I0223 14:50:12.890088 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a47ca88-30b0-4569-bcd5-00994d3facc0" containerName="mariadb-database-create"
Feb 23 14:50:12.895508 master-0 kubenswrapper[28758]: I0223 14:50:12.890798 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 14:50:12.908554 master-0 kubenswrapper[28758]: I0223 14:50:12.902334 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 23 14:50:12.908554 master-0 kubenswrapper[28758]: I0223 14:50:12.905392 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 23 14:50:12.908554 master-0 kubenswrapper[28758]: I0223 14:50:12.905982 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 23 14:50:12.927505 master-0 kubenswrapper[28758]: I0223 14:50:12.926988 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ddcf4757-slsxg"
Feb 23 14:50:13.059983 master-0 kubenswrapper[28758]: I0223 14:50:13.059932 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-config\") pod \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") "
Feb 23 14:50:13.060205 master-0 kubenswrapper[28758]: I0223 14:50:13.060007 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-svc\") pod \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") "
Feb 23 14:50:13.060205 master-0 kubenswrapper[28758]: I0223 14:50:13.060085 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-nb\") pod \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") "
Feb 23 14:50:13.060205 master-0 kubenswrapper[28758]: I0223 14:50:13.060155 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-sb\") pod \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") "
Feb 23 14:50:13.060343 master-0 kubenswrapper[28758]: I0223 14:50:13.060216 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-swift-storage-0\") pod \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") "
Feb 23 14:50:13.060343 master-0 kubenswrapper[28758]: I0223 14:50:13.060241 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpqhd\" (UniqueName: \"kubernetes.io/projected/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-kube-api-access-tpqhd\") pod \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\" (UID: \"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef\") "
Feb 23 14:50:13.060861 master-0 kubenswrapper[28758]: I0223 14:50:13.060830 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/718a5e05-c1d8-4982-8808-135e9883dab9-openstack-config\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.060943 master-0 kubenswrapper[28758]: I0223 14:50:13.060902 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718a5e05-c1d8-4982-8808-135e9883dab9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.061065 master-0 kubenswrapper[28758]: I0223 14:50:13.061050 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/718a5e05-c1d8-4982-8808-135e9883dab9-openstack-config-secret\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.061160 master-0 kubenswrapper[28758]: I0223 14:50:13.061136 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdnc\" (UniqueName: \"kubernetes.io/projected/718a5e05-c1d8-4982-8808-135e9883dab9-kube-api-access-zfdnc\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.070413 master-0 kubenswrapper[28758]: I0223 14:50:13.070343 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-kube-api-access-tpqhd" (OuterVolumeSpecName: "kube-api-access-tpqhd") pod "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" (UID: "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef"). InnerVolumeSpecName "kube-api-access-tpqhd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:50:13.177787 master-0 kubenswrapper[28758]: I0223 14:50:13.177702 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/718a5e05-c1d8-4982-8808-135e9883dab9-openstack-config\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.178004 master-0 kubenswrapper[28758]: I0223 14:50:13.177831 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718a5e05-c1d8-4982-8808-135e9883dab9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.178097 master-0 kubenswrapper[28758]: I0223 14:50:13.178069 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/718a5e05-c1d8-4982-8808-135e9883dab9-openstack-config-secret\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.178226 master-0 kubenswrapper[28758]: I0223 14:50:13.178195 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdnc\" (UniqueName: \"kubernetes.io/projected/718a5e05-c1d8-4982-8808-135e9883dab9-kube-api-access-zfdnc\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.179980 master-0 kubenswrapper[28758]: I0223 14:50:13.179334 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" (UID: "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:50:13.179980 master-0 kubenswrapper[28758]: I0223 14:50:13.179938 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/718a5e05-c1d8-4982-8808-135e9883dab9-openstack-config\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.181108 master-0 kubenswrapper[28758]: I0223 14:50:13.180368 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpqhd\" (UniqueName: \"kubernetes.io/projected/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-kube-api-access-tpqhd\") on node \"master-0\" DevicePath \"\""
Feb 23 14:50:13.184991 master-0 kubenswrapper[28758]: I0223 14:50:13.184951 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/718a5e05-c1d8-4982-8808-135e9883dab9-openstack-config-secret\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.189013 master-0 kubenswrapper[28758]: I0223 14:50:13.188891 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718a5e05-c1d8-4982-8808-135e9883dab9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.189719 master-0 kubenswrapper[28758]: I0223 14:50:13.189666 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-config" (OuterVolumeSpecName: "config") pod "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" (UID: "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:50:13.197658 master-0 kubenswrapper[28758]: I0223 14:50:13.197601 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdnc\" (UniqueName: \"kubernetes.io/projected/718a5e05-c1d8-4982-8808-135e9883dab9-kube-api-access-zfdnc\") pod \"openstackclient\" (UID: \"718a5e05-c1d8-4982-8808-135e9883dab9\") " pod="openstack/openstackclient"
Feb 23 14:50:13.198461 master-0 kubenswrapper[28758]: I0223 14:50:13.198400 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" (UID: "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:50:13.212614 master-0 kubenswrapper[28758]: I0223 14:50:13.212023 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" (UID: "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 14:50:13.245924 master-0 kubenswrapper[28758]: I0223 14:50:13.241623 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 23 14:50:13.254045 master-0 kubenswrapper[28758]: I0223 14:50:13.253987 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" (UID: "f30a16e8-1f43-43c6-bc0c-e471a0cda8ef"). InnerVolumeSpecName "dns-swift-storage-0".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:13.285991 master-0 kubenswrapper[28758]: I0223 14:50:13.285812 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:13.285991 master-0 kubenswrapper[28758]: I0223 14:50:13.285853 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:13.285991 master-0 kubenswrapper[28758]: I0223 14:50:13.285865 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:13.285991 master-0 kubenswrapper[28758]: I0223 14:50:13.285877 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:13.285991 master-0 kubenswrapper[28758]: I0223 14:50:13.285889 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:13.492497 master-0 kubenswrapper[28758]: I0223 14:50:13.492056 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" Feb 23 14:50:13.492908 master-0 kubenswrapper[28758]: I0223 14:50:13.492679 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ddcf4757-slsxg" event={"ID":"f30a16e8-1f43-43c6-bc0c-e471a0cda8ef","Type":"ContainerDied","Data":"001346ba47fe071524b194d59c260c1ff7699837870361fc77044032a22fa2e4"} Feb 23 14:50:13.492908 master-0 kubenswrapper[28758]: I0223 14:50:13.492753 28758 scope.go:117] "RemoveContainer" containerID="7cb67f8d72e3e54bce465a80b556566400c5d647ff8f81e442763ed5e78f5ff4" Feb 23 14:50:13.497031 master-0 kubenswrapper[28758]: I0223 14:50:13.496910 28758 scope.go:117] "RemoveContainer" containerID="99765de22307a1c342adcf8794d27957519466d1118f780cf8792ba120d06e91" Feb 23 14:50:13.497517 master-0 kubenswrapper[28758]: E0223 14:50:13.497226 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-7dbb4799c9-gg6zd_openstack(ddd7c1f3-7e33-4a32-a229-09521aa553e2)\"" pod="openstack/ironic-7dbb4799c9-gg6zd" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" Feb 23 14:50:13.538273 master-0 kubenswrapper[28758]: I0223 14:50:13.538144 28758 scope.go:117] "RemoveContainer" containerID="559c41debc259ac95fb37f6f92e19f67d910b7c8815d98b28491eb492e7aebd6" Feb 23 14:50:13.545301 master-0 kubenswrapper[28758]: I0223 14:50:13.545276 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ddcf4757-slsxg"] Feb 23 14:50:13.555375 master-0 kubenswrapper[28758]: I0223 14:50:13.555311 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ddcf4757-slsxg"] Feb 23 14:50:13.862010 master-0 kubenswrapper[28758]: I0223 14:50:13.859533 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 23 14:50:14.113820 master-0 kubenswrapper[28758]: I0223 14:50:14.113771 28758 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" path="/var/lib/kubelet/pods/f30a16e8-1f43-43c6-bc0c-e471a0cda8ef/volumes" Feb 23 14:50:14.515007 master-0 kubenswrapper[28758]: I0223 14:50:14.514955 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"718a5e05-c1d8-4982-8808-135e9883dab9","Type":"ContainerStarted","Data":"f8181f3d239a2b006d7c66eee8d7659d0a20f3b5004b87fea13df2e4f3a6648a"} Feb 23 14:50:15.111928 master-0 kubenswrapper[28758]: I0223 14:50:15.111873 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-2d990-backup-0" Feb 23 14:50:15.533949 master-0 kubenswrapper[28758]: I0223 14:50:15.533796 28758 generic.go:334] "Generic (PLEG): container finished" podID="8c0c40ba-2093-4e1a-8166-bbb4c53f3a08" containerID="7a8e55d4703a52a75f17c69a848f147a36062c67b0d146662f6b138cb546db14" exitCode=1 Feb 23 14:50:15.533949 master-0 kubenswrapper[28758]: I0223 14:50:15.533896 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" event={"ID":"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08","Type":"ContainerDied","Data":"7a8e55d4703a52a75f17c69a848f147a36062c67b0d146662f6b138cb546db14"} Feb 23 14:50:15.543665 master-0 kubenswrapper[28758]: I0223 14:50:15.533984 28758 scope.go:117] "RemoveContainer" containerID="7ec34062e9ab758e1b03daf0b27364f5d5426c10fdbad94ead0aa03eef90fe6c" Feb 23 14:50:15.543665 master-0 kubenswrapper[28758]: I0223 14:50:15.534825 28758 scope.go:117] "RemoveContainer" containerID="7a8e55d4703a52a75f17c69a848f147a36062c67b0d146662f6b138cb546db14" Feb 23 14:50:15.543665 master-0 kubenswrapper[28758]: E0223 14:50:15.535151 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent 
pod=ironic-neutron-agent-769d7f49c5-jj8xx_openstack(8c0c40ba-2093-4e1a-8166-bbb4c53f3a08)\"" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" podUID="8c0c40ba-2093-4e1a-8166-bbb4c53f3a08" Feb 23 14:50:15.744681 master-0 kubenswrapper[28758]: I0223 14:50:15.744474 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-8kt4c"] Feb 23 14:50:15.745209 master-0 kubenswrapper[28758]: E0223 14:50:15.745167 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" containerName="init" Feb 23 14:50:15.745209 master-0 kubenswrapper[28758]: I0223 14:50:15.745198 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" containerName="init" Feb 23 14:50:15.745351 master-0 kubenswrapper[28758]: E0223 14:50:15.745223 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" containerName="dnsmasq-dns" Feb 23 14:50:15.745351 master-0 kubenswrapper[28758]: I0223 14:50:15.745234 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" containerName="dnsmasq-dns" Feb 23 14:50:15.745618 master-0 kubenswrapper[28758]: I0223 14:50:15.745586 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="f30a16e8-1f43-43c6-bc0c-e471a0cda8ef" containerName="dnsmasq-dns" Feb 23 14:50:15.746502 master-0 kubenswrapper[28758]: I0223 14:50:15.746428 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.749525 master-0 kubenswrapper[28758]: I0223 14:50:15.749464 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Feb 23 14:50:15.750749 master-0 kubenswrapper[28758]: I0223 14:50:15.750496 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Feb 23 14:50:15.788407 master-0 kubenswrapper[28758]: I0223 14:50:15.788292 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-8kt4c"] Feb 23 14:50:15.875186 master-0 kubenswrapper[28758]: I0223 14:50:15.875112 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.875419 master-0 kubenswrapper[28758]: I0223 14:50:15.875194 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24s57\" (UniqueName: \"kubernetes.io/projected/98bc143e-fd3c-4a62-a069-5e2357cb2209-kube-api-access-24s57\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.875419 master-0 kubenswrapper[28758]: I0223 14:50:15.875271 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-config\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.875419 master-0 kubenswrapper[28758]: I0223 14:50:15.875382 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-combined-ca-bundle\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.875600 master-0 kubenswrapper[28758]: I0223 14:50:15.875473 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98bc143e-fd3c-4a62-a069-5e2357cb2209-etc-podinfo\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.876021 master-0 kubenswrapper[28758]: I0223 14:50:15.875963 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-scripts\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.876103 master-0 kubenswrapper[28758]: I0223 14:50:15.876076 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.980574 master-0 kubenswrapper[28758]: I0223 14:50:15.980271 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: 
\"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.980574 master-0 kubenswrapper[28758]: I0223 14:50:15.980353 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24s57\" (UniqueName: \"kubernetes.io/projected/98bc143e-fd3c-4a62-a069-5e2357cb2209-kube-api-access-24s57\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.980574 master-0 kubenswrapper[28758]: I0223 14:50:15.980404 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-config\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.980574 master-0 kubenswrapper[28758]: I0223 14:50:15.980493 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-combined-ca-bundle\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.980574 master-0 kubenswrapper[28758]: I0223 14:50:15.980548 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98bc143e-fd3c-4a62-a069-5e2357cb2209-etc-podinfo\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.980574 master-0 kubenswrapper[28758]: I0223 14:50:15.980567 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-scripts\") pod 
\"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.980574 master-0 kubenswrapper[28758]: I0223 14:50:15.980590 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.981140 master-0 kubenswrapper[28758]: I0223 14:50:15.981106 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.981354 master-0 kubenswrapper[28758]: I0223 14:50:15.981330 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.986394 master-0 kubenswrapper[28758]: I0223 14:50:15.985499 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98bc143e-fd3c-4a62-a069-5e2357cb2209-etc-podinfo\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.986394 master-0 kubenswrapper[28758]: I0223 14:50:15.985879 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-combined-ca-bundle\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.986804 master-0 kubenswrapper[28758]: I0223 14:50:15.986770 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-config\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:15.987857 master-0 kubenswrapper[28758]: I0223 14:50:15.987792 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-scripts\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:16.003327 master-0 kubenswrapper[28758]: I0223 14:50:16.002142 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24s57\" (UniqueName: \"kubernetes.io/projected/98bc143e-fd3c-4a62-a069-5e2357cb2209-kube-api-access-24s57\") pod \"ironic-inspector-db-sync-8kt4c\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:16.105554 master-0 kubenswrapper[28758]: I0223 14:50:16.105272 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:16.576799 master-0 kubenswrapper[28758]: I0223 14:50:16.576584 28758 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:16.578215 master-0 kubenswrapper[28758]: I0223 14:50:16.577710 28758 scope.go:117] "RemoveContainer" containerID="7a8e55d4703a52a75f17c69a848f147a36062c67b0d146662f6b138cb546db14" Feb 23 14:50:16.578318 master-0 kubenswrapper[28758]: E0223 14:50:16.578279 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-769d7f49c5-jj8xx_openstack(8c0c40ba-2093-4e1a-8166-bbb4c53f3a08)\"" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" podUID="8c0c40ba-2093-4e1a-8166-bbb4c53f3a08" Feb 23 14:50:16.670239 master-0 kubenswrapper[28758]: I0223 14:50:16.669979 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-8kt4c"] Feb 23 14:50:16.955722 master-0 kubenswrapper[28758]: I0223 14:50:16.955653 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7cd7df45f8-pgvmr"] Feb 23 14:50:16.962864 master-0 kubenswrapper[28758]: I0223 14:50:16.961093 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:16.965573 master-0 kubenswrapper[28758]: I0223 14:50:16.965003 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 23 14:50:16.965573 master-0 kubenswrapper[28758]: I0223 14:50:16.965175 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 23 14:50:16.965573 master-0 kubenswrapper[28758]: I0223 14:50:16.965268 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 23 14:50:16.986101 master-0 kubenswrapper[28758]: I0223 14:50:16.985999 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7cd7df45f8-pgvmr"] Feb 23 14:50:17.018416 master-0 kubenswrapper[28758]: I0223 14:50:17.017365 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-55d877d745-zlz8n" Feb 23 14:50:17.038200 master-0 kubenswrapper[28758]: I0223 14:50:17.038103 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-run-httpd\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.038821 master-0 kubenswrapper[28758]: I0223 14:50:17.038245 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-config-data\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.038821 master-0 kubenswrapper[28758]: I0223 14:50:17.038342 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-internal-tls-certs\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.038821 master-0 kubenswrapper[28758]: I0223 14:50:17.038742 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-log-httpd\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.038821 master-0 kubenswrapper[28758]: I0223 14:50:17.038765 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-combined-ca-bundle\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.038996 master-0 kubenswrapper[28758]: I0223 14:50:17.038865 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-etc-swift\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.039880 master-0 kubenswrapper[28758]: I0223 14:50:17.039199 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-public-tls-certs\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.039880 master-0 kubenswrapper[28758]: I0223 14:50:17.039278 
28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td2wv\" (UniqueName: \"kubernetes.io/projected/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-kube-api-access-td2wv\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.138737 master-0 kubenswrapper[28758]: I0223 14:50:17.136962 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-7dbb4799c9-gg6zd"] Feb 23 14:50:17.138737 master-0 kubenswrapper[28758]: I0223 14:50:17.137290 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-7dbb4799c9-gg6zd" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api-log" containerID="cri-o://146c10621d1b6c06a49e812424dab5aeb362a464f2a46c840f5ddc128e722f23" gracePeriod=60 Feb 23 14:50:17.140788 master-0 kubenswrapper[28758]: I0223 14:50:17.140759 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-config-data\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.140993 master-0 kubenswrapper[28758]: I0223 14:50:17.140978 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-internal-tls-certs\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.142057 master-0 kubenswrapper[28758]: I0223 14:50:17.142040 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-log-httpd\") pod 
\"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.142161 master-0 kubenswrapper[28758]: I0223 14:50:17.142148 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-combined-ca-bundle\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.142363 master-0 kubenswrapper[28758]: I0223 14:50:17.142348 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-etc-swift\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.144409 master-0 kubenswrapper[28758]: I0223 14:50:17.144239 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-log-httpd\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.144580 master-0 kubenswrapper[28758]: I0223 14:50:17.144562 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-public-tls-certs\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.145283 master-0 kubenswrapper[28758]: I0223 14:50:17.145267 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td2wv\" (UniqueName: 
\"kubernetes.io/projected/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-kube-api-access-td2wv\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.145993 master-0 kubenswrapper[28758]: I0223 14:50:17.145977 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-run-httpd\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.146846 master-0 kubenswrapper[28758]: I0223 14:50:17.146827 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-run-httpd\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.148459 master-0 kubenswrapper[28758]: I0223 14:50:17.148358 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-public-tls-certs\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.153212 master-0 kubenswrapper[28758]: I0223 14:50:17.152579 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-internal-tls-certs\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.163160 master-0 kubenswrapper[28758]: I0223 14:50:17.162704 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-combined-ca-bundle\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.181694 master-0 kubenswrapper[28758]: I0223 14:50:17.168378 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-config-data\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.181694 master-0 kubenswrapper[28758]: I0223 14:50:17.172515 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-etc-swift\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.181694 master-0 kubenswrapper[28758]: I0223 14:50:17.177803 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td2wv\" (UniqueName: \"kubernetes.io/projected/3d1bdc4f-86b7-4c36-906b-4aa5e49cd017-kube-api-access-td2wv\") pod \"swift-proxy-7cd7df45f8-pgvmr\" (UID: \"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017\") " pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.314509 master-0 kubenswrapper[28758]: I0223 14:50:17.308826 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:17.602707 master-0 kubenswrapper[28758]: I0223 14:50:17.599735 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8kt4c" event={"ID":"98bc143e-fd3c-4a62-a069-5e2357cb2209","Type":"ContainerStarted","Data":"86b385bb89c71ab6eb6fd05d275d5171f4997cccacf605af20d019d9846a5b6f"} Feb 23 14:50:17.604403 master-0 kubenswrapper[28758]: I0223 14:50:17.604363 28758 generic.go:334] "Generic (PLEG): container finished" podID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerID="146c10621d1b6c06a49e812424dab5aeb362a464f2a46c840f5ddc128e722f23" exitCode=143 Feb 23 14:50:17.604617 master-0 kubenswrapper[28758]: I0223 14:50:17.604419 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dbb4799c9-gg6zd" event={"ID":"ddd7c1f3-7e33-4a32-a229-09521aa553e2","Type":"ContainerDied","Data":"146c10621d1b6c06a49e812424dab5aeb362a464f2a46c840f5ddc128e722f23"} Feb 23 14:50:18.087596 master-0 kubenswrapper[28758]: I0223 14:50:18.086742 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rk2l6"] Feb 23 14:50:18.102506 master-0 kubenswrapper[28758]: I0223 14:50:18.093618 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.156583 master-0 kubenswrapper[28758]: I0223 14:50:18.155884 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:18.172150 master-0 kubenswrapper[28758]: W0223 14:50:18.171118 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d1bdc4f_86b7_4c36_906b_4aa5e49cd017.slice/crio-18951324c9e242185ba8bec628bf2b60ff0dc70bc4451bd6080e58262531653d WatchSource:0}: Error finding container 18951324c9e242185ba8bec628bf2b60ff0dc70bc4451bd6080e58262531653d: Status 404 returned error can't find the container with id 18951324c9e242185ba8bec628bf2b60ff0dc70bc4451bd6080e58262531653d Feb 23 14:50:18.181751 master-0 kubenswrapper[28758]: I0223 14:50:18.181637 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rk2l6"] Feb 23 14:50:18.189403 master-0 kubenswrapper[28758]: I0223 14:50:18.189324 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7cd7df45f8-pgvmr"] Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.208450 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-p68gm"] Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: E0223 14:50:18.209117 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209138 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: E0223 14:50:18.209177 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api-log" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209183 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api-log" Feb 23 
14:50:18.225630 master-0 kubenswrapper[28758]: E0223 14:50:18.209212 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="init" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209220 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="init" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209499 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api-log" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209521 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209547 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209806 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-combined-ca-bundle\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.209966 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-merged\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.210078 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-logs\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.210305 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.210414 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-scripts\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.210452 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddd7c1f3-7e33-4a32-a229-09521aa553e2-etc-podinfo\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.210704 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-custom\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.211306 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.210816 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.213064 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn2gq\" (UniqueName: \"kubernetes.io/projected/ddd7c1f3-7e33-4a32-a229-09521aa553e2-kube-api-access-sn2gq\") pod \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\" (UID: \"ddd7c1f3-7e33-4a32-a229-09521aa553e2\") " Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.213443 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a17006-c08e-4ac7-8b6f-f412b4249c6c-operator-scripts\") pod \"nova-api-db-create-rk2l6\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.213557 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qvx\" (UniqueName: \"kubernetes.io/projected/04a17006-c08e-4ac7-8b6f-f412b4249c6c-kube-api-access-g8qvx\") pod \"nova-api-db-create-rk2l6\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 14:50:18.214420 28758 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-merged\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.225630 master-0 kubenswrapper[28758]: I0223 
14:50:18.217181 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-logs" (OuterVolumeSpecName: "logs") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:18.228891 master-0 kubenswrapper[28758]: I0223 14:50:18.228778 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:18.240642 master-0 kubenswrapper[28758]: I0223 14:50:18.240568 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-844d-account-create-update-l79ql"] Feb 23 14:50:18.241335 master-0 kubenswrapper[28758]: E0223 14:50:18.241289 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api" Feb 23 14:50:18.241335 master-0 kubenswrapper[28758]: I0223 14:50:18.241315 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" containerName="ironic-api" Feb 23 14:50:18.242781 master-0 kubenswrapper[28758]: I0223 14:50:18.242692 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.245876 master-0 kubenswrapper[28758]: I0223 14:50:18.245802 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ddd7c1f3-7e33-4a32-a229-09521aa553e2-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 14:50:18.248654 master-0 kubenswrapper[28758]: I0223 14:50:18.246983 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd7c1f3-7e33-4a32-a229-09521aa553e2-kube-api-access-sn2gq" (OuterVolumeSpecName: "kube-api-access-sn2gq") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "kube-api-access-sn2gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:18.249950 master-0 kubenswrapper[28758]: I0223 14:50:18.249883 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 23 14:50:18.250124 master-0 kubenswrapper[28758]: I0223 14:50:18.250099 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p68gm"] Feb 23 14:50:18.252372 master-0 kubenswrapper[28758]: I0223 14:50:18.252311 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-scripts" (OuterVolumeSpecName: "scripts") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:18.276623 master-0 kubenswrapper[28758]: I0223 14:50:18.275676 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-844d-account-create-update-l79ql"] Feb 23 14:50:18.286287 master-0 kubenswrapper[28758]: I0223 14:50:18.285109 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data" (OuterVolumeSpecName: "config-data") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:18.316642 master-0 kubenswrapper[28758]: I0223 14:50:18.316183 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad3874c-6300-4cff-80d7-b332ebc88e5d-operator-scripts\") pod \"nova-api-844d-account-create-update-l79ql\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.318762 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-operator-scripts\") pod \"nova-cell0-db-create-p68gm\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.318842 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l7lz\" (UniqueName: \"kubernetes.io/projected/4ad3874c-6300-4cff-80d7-b332ebc88e5d-kube-api-access-5l7lz\") pod \"nova-api-844d-account-create-update-l79ql\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " 
pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.318977 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd9c6\" (UniqueName: \"kubernetes.io/projected/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-kube-api-access-cd9c6\") pod \"nova-cell0-db-create-p68gm\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319139 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a17006-c08e-4ac7-8b6f-f412b4249c6c-operator-scripts\") pod \"nova-api-db-create-rk2l6\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319228 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qvx\" (UniqueName: \"kubernetes.io/projected/04a17006-c08e-4ac7-8b6f-f412b4249c6c-kube-api-access-g8qvx\") pod \"nova-api-db-create-rk2l6\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319424 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319443 28758 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ddd7c1f3-7e33-4a32-a229-09521aa553e2-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319500 28758 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data-custom\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319515 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319526 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn2gq\" (UniqueName: \"kubernetes.io/projected/ddd7c1f3-7e33-4a32-a229-09521aa553e2-kube-api-access-sn2gq\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.319759 master-0 kubenswrapper[28758]: I0223 14:50:18.319567 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddd7c1f3-7e33-4a32-a229-09521aa553e2-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.321000 master-0 kubenswrapper[28758]: I0223 14:50:18.320886 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a17006-c08e-4ac7-8b6f-f412b4249c6c-operator-scripts\") pod \"nova-api-db-create-rk2l6\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.359795 master-0 kubenswrapper[28758]: I0223 14:50:18.354824 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qvx\" (UniqueName: \"kubernetes.io/projected/04a17006-c08e-4ac7-8b6f-f412b4249c6c-kube-api-access-g8qvx\") pod \"nova-api-db-create-rk2l6\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.427287 master-0 kubenswrapper[28758]: I0223 14:50:18.424462 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-operator-scripts\") pod \"nova-cell0-db-create-p68gm\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.427287 master-0 kubenswrapper[28758]: I0223 14:50:18.424539 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l7lz\" (UniqueName: \"kubernetes.io/projected/4ad3874c-6300-4cff-80d7-b332ebc88e5d-kube-api-access-5l7lz\") pod \"nova-api-844d-account-create-update-l79ql\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.427287 master-0 kubenswrapper[28758]: I0223 14:50:18.424597 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd9c6\" (UniqueName: \"kubernetes.io/projected/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-kube-api-access-cd9c6\") pod \"nova-cell0-db-create-p68gm\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.427287 master-0 kubenswrapper[28758]: I0223 14:50:18.424713 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad3874c-6300-4cff-80d7-b332ebc88e5d-operator-scripts\") pod \"nova-api-844d-account-create-update-l79ql\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.427287 master-0 kubenswrapper[28758]: I0223 14:50:18.427006 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad3874c-6300-4cff-80d7-b332ebc88e5d-operator-scripts\") pod \"nova-api-844d-account-create-update-l79ql\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.428770 master-0 kubenswrapper[28758]: I0223 
14:50:18.428025 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-operator-scripts\") pod \"nova-cell0-db-create-p68gm\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.454325 master-0 kubenswrapper[28758]: I0223 14:50:18.454261 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd9c6\" (UniqueName: \"kubernetes.io/projected/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-kube-api-access-cd9c6\") pod \"nova-cell0-db-create-p68gm\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.457254 master-0 kubenswrapper[28758]: I0223 14:50:18.457187 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l7lz\" (UniqueName: \"kubernetes.io/projected/4ad3874c-6300-4cff-80d7-b332ebc88e5d-kube-api-access-5l7lz\") pod \"nova-api-844d-account-create-update-l79ql\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.465754 master-0 kubenswrapper[28758]: I0223 14:50:18.465673 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:18.478821 master-0 kubenswrapper[28758]: I0223 14:50:18.467433 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd7c1f3-7e33-4a32-a229-09521aa553e2" (UID: "ddd7c1f3-7e33-4a32-a229-09521aa553e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:18.499663 master-0 kubenswrapper[28758]: I0223 14:50:18.495437 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:18.512136 master-0 kubenswrapper[28758]: I0223 14:50:18.512057 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:18.531380 master-0 kubenswrapper[28758]: I0223 14:50:18.531306 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd7c1f3-7e33-4a32-a229-09521aa553e2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:18.575900 master-0 kubenswrapper[28758]: I0223 14:50:18.575824 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tggnt"] Feb 23 14:50:18.580277 master-0 kubenswrapper[28758]: I0223 14:50:18.580229 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.622786 master-0 kubenswrapper[28758]: I0223 14:50:18.622727 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-73e0-account-create-update-v8x2s"] Feb 23 14:50:18.625976 master-0 kubenswrapper[28758]: I0223 14:50:18.625920 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:18.631289 master-0 kubenswrapper[28758]: I0223 14:50:18.629390 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 23 14:50:18.632968 master-0 kubenswrapper[28758]: I0223 14:50:18.632904 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tknnk\" (UniqueName: \"kubernetes.io/projected/2e586dba-e451-4458-972b-223909c7901e-kube-api-access-tknnk\") pod \"nova-cell1-db-create-tggnt\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.633071 master-0 kubenswrapper[28758]: I0223 14:50:18.633057 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e586dba-e451-4458-972b-223909c7901e-operator-scripts\") pod \"nova-cell1-db-create-tggnt\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.667179 master-0 kubenswrapper[28758]: I0223 14:50:18.662069 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tggnt"] Feb 23 14:50:18.670009 master-0 kubenswrapper[28758]: I0223 14:50:18.669928 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-73e0-account-create-update-v8x2s"] Feb 23 14:50:18.670889 master-0 kubenswrapper[28758]: I0223 14:50:18.670833 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-7dbb4799c9-gg6zd" event={"ID":"ddd7c1f3-7e33-4a32-a229-09521aa553e2","Type":"ContainerDied","Data":"9ec8606c9605fc6bff44c7578f847f8da953577b255dff354684cc17bf229516"} Feb 23 14:50:18.670889 master-0 kubenswrapper[28758]: I0223 14:50:18.670873 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-7dbb4799c9-gg6zd" Feb 23 14:50:18.671014 master-0 kubenswrapper[28758]: I0223 14:50:18.670901 28758 scope.go:117] "RemoveContainer" containerID="99765de22307a1c342adcf8794d27957519466d1118f780cf8792ba120d06e91" Feb 23 14:50:18.673367 master-0 kubenswrapper[28758]: I0223 14:50:18.673310 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" event={"ID":"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017","Type":"ContainerStarted","Data":"18951324c9e242185ba8bec628bf2b60ff0dc70bc4451bd6080e58262531653d"} Feb 23 14:50:18.684068 master-0 kubenswrapper[28758]: I0223 14:50:18.682166 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8d9a-account-create-update-47dvn"] Feb 23 14:50:18.733094 master-0 kubenswrapper[28758]: I0223 14:50:18.724602 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8d9a-account-create-update-47dvn"] Feb 23 14:50:18.733094 master-0 kubenswrapper[28758]: I0223 14:50:18.724730 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:18.737003 master-0 kubenswrapper[28758]: I0223 14:50:18.734045 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 23 14:50:18.737003 master-0 kubenswrapper[28758]: I0223 14:50:18.735205 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqww6\" (UniqueName: \"kubernetes.io/projected/708dbd36-4a71-40d9-8de7-827de4921a8a-kube-api-access-gqww6\") pod \"nova-cell0-73e0-account-create-update-v8x2s\" (UID: \"708dbd36-4a71-40d9-8de7-827de4921a8a\") " pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:18.737003 master-0 kubenswrapper[28758]: I0223 14:50:18.735325 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e586dba-e451-4458-972b-223909c7901e-operator-scripts\") pod \"nova-cell1-db-create-tggnt\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.737003 master-0 kubenswrapper[28758]: I0223 14:50:18.735449 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/708dbd36-4a71-40d9-8de7-827de4921a8a-operator-scripts\") pod \"nova-cell0-73e0-account-create-update-v8x2s\" (UID: \"708dbd36-4a71-40d9-8de7-827de4921a8a\") " pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:18.737003 master-0 kubenswrapper[28758]: I0223 14:50:18.735530 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tknnk\" (UniqueName: \"kubernetes.io/projected/2e586dba-e451-4458-972b-223909c7901e-kube-api-access-tknnk\") pod \"nova-cell1-db-create-tggnt\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " 
pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.738023 master-0 kubenswrapper[28758]: I0223 14:50:18.737286 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e586dba-e451-4458-972b-223909c7901e-operator-scripts\") pod \"nova-cell1-db-create-tggnt\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.771864 master-0 kubenswrapper[28758]: I0223 14:50:18.771783 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-7dbb4799c9-gg6zd"] Feb 23 14:50:18.811064 master-0 kubenswrapper[28758]: I0223 14:50:18.810957 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tknnk\" (UniqueName: \"kubernetes.io/projected/2e586dba-e451-4458-972b-223909c7901e-kube-api-access-tknnk\") pod \"nova-cell1-db-create-tggnt\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.842171 master-0 kubenswrapper[28758]: I0223 14:50:18.838752 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/708dbd36-4a71-40d9-8de7-827de4921a8a-operator-scripts\") pod \"nova-cell0-73e0-account-create-update-v8x2s\" (UID: \"708dbd36-4a71-40d9-8de7-827de4921a8a\") " pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:18.842171 master-0 kubenswrapper[28758]: I0223 14:50:18.839898 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f37d7c-b625-42da-93ff-b6d5a1702356-operator-scripts\") pod \"nova-cell1-8d9a-account-create-update-47dvn\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:18.842171 master-0 kubenswrapper[28758]: I0223 14:50:18.839934 28758 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/708dbd36-4a71-40d9-8de7-827de4921a8a-operator-scripts\") pod \"nova-cell0-73e0-account-create-update-v8x2s\" (UID: \"708dbd36-4a71-40d9-8de7-827de4921a8a\") " pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:18.842171 master-0 kubenswrapper[28758]: I0223 14:50:18.839981 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqww6\" (UniqueName: \"kubernetes.io/projected/708dbd36-4a71-40d9-8de7-827de4921a8a-kube-api-access-gqww6\") pod \"nova-cell0-73e0-account-create-update-v8x2s\" (UID: \"708dbd36-4a71-40d9-8de7-827de4921a8a\") " pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:18.842171 master-0 kubenswrapper[28758]: I0223 14:50:18.840262 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9kl\" (UniqueName: \"kubernetes.io/projected/48f37d7c-b625-42da-93ff-b6d5a1702356-kube-api-access-8l9kl\") pod \"nova-cell1-8d9a-account-create-update-47dvn\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:18.856118 master-0 kubenswrapper[28758]: I0223 14:50:18.850856 28758 scope.go:117] "RemoveContainer" containerID="146c10621d1b6c06a49e812424dab5aeb362a464f2a46c840f5ddc128e722f23" Feb 23 14:50:18.926111 master-0 kubenswrapper[28758]: I0223 14:50:18.926061 28758 scope.go:117] "RemoveContainer" containerID="00469c31edf043d3bd6d22e3bf72fc59d69df0f246c782a7ae55c95933685939" Feb 23 14:50:18.932659 master-0 kubenswrapper[28758]: I0223 14:50:18.932605 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqww6\" (UniqueName: \"kubernetes.io/projected/708dbd36-4a71-40d9-8de7-827de4921a8a-kube-api-access-gqww6\") pod \"nova-cell0-73e0-account-create-update-v8x2s\" (UID: 
\"708dbd36-4a71-40d9-8de7-827de4921a8a\") " pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:18.945541 master-0 kubenswrapper[28758]: I0223 14:50:18.945426 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9kl\" (UniqueName: \"kubernetes.io/projected/48f37d7c-b625-42da-93ff-b6d5a1702356-kube-api-access-8l9kl\") pod \"nova-cell1-8d9a-account-create-update-47dvn\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:18.948085 master-0 kubenswrapper[28758]: I0223 14:50:18.948026 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f37d7c-b625-42da-93ff-b6d5a1702356-operator-scripts\") pod \"nova-cell1-8d9a-account-create-update-47dvn\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:18.948883 master-0 kubenswrapper[28758]: I0223 14:50:18.948849 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f37d7c-b625-42da-93ff-b6d5a1702356-operator-scripts\") pod \"nova-cell1-8d9a-account-create-update-47dvn\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:18.964849 master-0 kubenswrapper[28758]: I0223 14:50:18.964767 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-7dbb4799c9-gg6zd"] Feb 23 14:50:18.968008 master-0 kubenswrapper[28758]: I0223 14:50:18.967079 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:18.999703 master-0 kubenswrapper[28758]: I0223 14:50:18.999625 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:19.088930 master-0 kubenswrapper[28758]: I0223 14:50:19.088734 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9kl\" (UniqueName: \"kubernetes.io/projected/48f37d7c-b625-42da-93ff-b6d5a1702356-kube-api-access-8l9kl\") pod \"nova-cell1-8d9a-account-create-update-47dvn\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:19.115576 master-0 kubenswrapper[28758]: I0223 14:50:19.115502 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:19.572271 master-0 kubenswrapper[28758]: I0223 14:50:19.568649 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rk2l6"] Feb 23 14:50:19.588532 master-0 kubenswrapper[28758]: I0223 14:50:19.586330 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-844d-account-create-update-l79ql"] Feb 23 14:50:19.605292 master-0 kubenswrapper[28758]: I0223 14:50:19.604881 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p68gm"] Feb 23 14:50:19.666692 master-0 kubenswrapper[28758]: I0223 14:50:19.658515 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tggnt"] Feb 23 14:50:19.693102 master-0 kubenswrapper[28758]: I0223 14:50:19.693021 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-73e0-account-create-update-v8x2s"] Feb 23 14:50:19.697217 master-0 kubenswrapper[28758]: I0223 14:50:19.697172 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" event={"ID":"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017","Type":"ContainerStarted","Data":"0223a524e10e6e19c0d53ce55f9e145ab77dae2de456a540a652f46e1bd41b86"} Feb 23 14:50:19.697217 
master-0 kubenswrapper[28758]: I0223 14:50:19.697213 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" event={"ID":"3d1bdc4f-86b7-4c36-906b-4aa5e49cd017","Type":"ContainerStarted","Data":"5fe16303bcf46fd16e80c37dce0dc511ffcd51e0c35dbaadd43d3eac662c6e2e"} Feb 23 14:50:19.697647 master-0 kubenswrapper[28758]: I0223 14:50:19.697595 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:19.732305 master-0 kubenswrapper[28758]: I0223 14:50:19.732172 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" podStartSLOduration=3.732144045 podStartE2EDuration="3.732144045s" podCreationTimestamp="2026-02-23 14:50:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:19.721997965 +0000 UTC m=+951.848313907" watchObservedRunningTime="2026-02-23 14:50:19.732144045 +0000 UTC m=+951.858460007" Feb 23 14:50:19.891725 master-0 kubenswrapper[28758]: I0223 14:50:19.891664 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8d9a-account-create-update-47dvn"] Feb 23 14:50:20.113118 master-0 kubenswrapper[28758]: I0223 14:50:20.111598 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd7c1f3-7e33-4a32-a229-09521aa553e2" path="/var/lib/kubelet/pods/ddd7c1f3-7e33-4a32-a229-09521aa553e2/volumes" Feb 23 14:50:20.712467 master-0 kubenswrapper[28758]: I0223 14:50:20.712412 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:21.240884 master-0 kubenswrapper[28758]: W0223 14:50:21.240827 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod708dbd36_4a71_40d9_8de7_827de4921a8a.slice/crio-6dc9a9b1b899f98cea70fae7762222cafbf34b0247e6e079c36639b87000de87 WatchSource:0}: Error finding container 6dc9a9b1b899f98cea70fae7762222cafbf34b0247e6e079c36639b87000de87: Status 404 returned error can't find the container with id 6dc9a9b1b899f98cea70fae7762222cafbf34b0247e6e079c36639b87000de87 Feb 23 14:50:21.242615 master-0 kubenswrapper[28758]: W0223 14:50:21.242563 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04a17006_c08e_4ac7_8b6f_f412b4249c6c.slice/crio-c367da00d44ca612e78803453161061c6f74752b934353ee3c8e35d18ee7f2e3 WatchSource:0}: Error finding container c367da00d44ca612e78803453161061c6f74752b934353ee3c8e35d18ee7f2e3: Status 404 returned error can't find the container with id c367da00d44ca612e78803453161061c6f74752b934353ee3c8e35d18ee7f2e3 Feb 23 14:50:21.254361 master-0 kubenswrapper[28758]: W0223 14:50:21.254274 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e586dba_e451_4458_972b_223909c7901e.slice/crio-dce1a6f8f6f8345f897314b1b1ad244d7c6b80e59a9f5027cee6424581027e7c WatchSource:0}: Error finding container dce1a6f8f6f8345f897314b1b1ad244d7c6b80e59a9f5027cee6424581027e7c: Status 404 returned error can't find the container with id dce1a6f8f6f8345f897314b1b1ad244d7c6b80e59a9f5027cee6424581027e7c Feb 23 14:50:21.258373 master-0 kubenswrapper[28758]: W0223 14:50:21.258286 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2f2ca95_2590_4599_8b0f_4714ed65c1b4.slice/crio-bf0f0600d7d37aad4b5a82f93c1da195be32efd3deb3d8d78c209ae0cccf8415 WatchSource:0}: Error finding container bf0f0600d7d37aad4b5a82f93c1da195be32efd3deb3d8d78c209ae0cccf8415: Status 404 returned error can't find the 
container with id bf0f0600d7d37aad4b5a82f93c1da195be32efd3deb3d8d78c209ae0cccf8415 Feb 23 14:50:21.264571 master-0 kubenswrapper[28758]: W0223 14:50:21.264517 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f37d7c_b625_42da_93ff_b6d5a1702356.slice/crio-64620923c66062f6f5afb17ba032d5205ee2fd42075f5f415736d215c1278faf WatchSource:0}: Error finding container 64620923c66062f6f5afb17ba032d5205ee2fd42075f5f415736d215c1278faf: Status 404 returned error can't find the container with id 64620923c66062f6f5afb17ba032d5205ee2fd42075f5f415736d215c1278faf Feb 23 14:50:21.268356 master-0 kubenswrapper[28758]: W0223 14:50:21.268194 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ad3874c_6300_4cff_80d7_b332ebc88e5d.slice/crio-54a3df2feb3d641a35a85c9cdccf71898cc68d764a1cbcb2195f02bf0d8151e9 WatchSource:0}: Error finding container 54a3df2feb3d641a35a85c9cdccf71898cc68d764a1cbcb2195f02bf0d8151e9: Status 404 returned error can't find the container with id 54a3df2feb3d641a35a85c9cdccf71898cc68d764a1cbcb2195f02bf0d8151e9 Feb 23 14:50:21.745458 master-0 kubenswrapper[28758]: I0223 14:50:21.745338 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" event={"ID":"48f37d7c-b625-42da-93ff-b6d5a1702356","Type":"ContainerStarted","Data":"81b5a93cbe911727c15ff561e49dbfea918939c3fadde01e9b97707a0530b51d"} Feb 23 14:50:21.745458 master-0 kubenswrapper[28758]: I0223 14:50:21.745444 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" event={"ID":"48f37d7c-b625-42da-93ff-b6d5a1702356","Type":"ContainerStarted","Data":"64620923c66062f6f5afb17ba032d5205ee2fd42075f5f415736d215c1278faf"} Feb 23 14:50:21.751240 master-0 kubenswrapper[28758]: I0223 14:50:21.751138 28758 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" event={"ID":"708dbd36-4a71-40d9-8de7-827de4921a8a","Type":"ContainerStarted","Data":"6dc9a9b1b899f98cea70fae7762222cafbf34b0247e6e079c36639b87000de87"} Feb 23 14:50:21.756449 master-0 kubenswrapper[28758]: I0223 14:50:21.756375 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rk2l6" event={"ID":"04a17006-c08e-4ac7-8b6f-f412b4249c6c","Type":"ContainerStarted","Data":"c367da00d44ca612e78803453161061c6f74752b934353ee3c8e35d18ee7f2e3"} Feb 23 14:50:21.762342 master-0 kubenswrapper[28758]: I0223 14:50:21.760960 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tggnt" event={"ID":"2e586dba-e451-4458-972b-223909c7901e","Type":"ContainerStarted","Data":"dce1a6f8f6f8345f897314b1b1ad244d7c6b80e59a9f5027cee6424581027e7c"} Feb 23 14:50:21.772450 master-0 kubenswrapper[28758]: I0223 14:50:21.769612 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p68gm" event={"ID":"f2f2ca95-2590-4599-8b0f-4714ed65c1b4","Type":"ContainerStarted","Data":"bf0f0600d7d37aad4b5a82f93c1da195be32efd3deb3d8d78c209ae0cccf8415"} Feb 23 14:50:21.776579 master-0 kubenswrapper[28758]: I0223 14:50:21.774358 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-844d-account-create-update-l79ql" event={"ID":"4ad3874c-6300-4cff-80d7-b332ebc88e5d","Type":"ContainerStarted","Data":"54a3df2feb3d641a35a85c9cdccf71898cc68d764a1cbcb2195f02bf0d8151e9"} Feb 23 14:50:21.796086 master-0 kubenswrapper[28758]: I0223 14:50:21.793947 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" podStartSLOduration=3.793923275 podStartE2EDuration="3.793923275s" podCreationTimestamp="2026-02-23 14:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 14:50:21.789149328 +0000 UTC m=+953.915465270" watchObservedRunningTime="2026-02-23 14:50:21.793923275 +0000 UTC m=+953.920239207" Feb 23 14:50:22.788593 master-0 kubenswrapper[28758]: I0223 14:50:22.788519 28758 generic.go:334] "Generic (PLEG): container finished" podID="4ad3874c-6300-4cff-80d7-b332ebc88e5d" containerID="d0727e4a4a021693796beed67c9d1e16bd7e64167c5f7f6fb334424cfa61f51a" exitCode=0 Feb 23 14:50:22.789217 master-0 kubenswrapper[28758]: I0223 14:50:22.788642 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-844d-account-create-update-l79ql" event={"ID":"4ad3874c-6300-4cff-80d7-b332ebc88e5d","Type":"ContainerDied","Data":"d0727e4a4a021693796beed67c9d1e16bd7e64167c5f7f6fb334424cfa61f51a"} Feb 23 14:50:22.792123 master-0 kubenswrapper[28758]: I0223 14:50:22.792084 28758 generic.go:334] "Generic (PLEG): container finished" podID="48f37d7c-b625-42da-93ff-b6d5a1702356" containerID="81b5a93cbe911727c15ff561e49dbfea918939c3fadde01e9b97707a0530b51d" exitCode=0 Feb 23 14:50:22.792467 master-0 kubenswrapper[28758]: I0223 14:50:22.792154 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" event={"ID":"48f37d7c-b625-42da-93ff-b6d5a1702356","Type":"ContainerDied","Data":"81b5a93cbe911727c15ff561e49dbfea918939c3fadde01e9b97707a0530b51d"} Feb 23 14:50:22.795594 master-0 kubenswrapper[28758]: I0223 14:50:22.795258 28758 generic.go:334] "Generic (PLEG): container finished" podID="708dbd36-4a71-40d9-8de7-827de4921a8a" containerID="92c8c5e5998cae5570d1aa787109bc8e92bb5d07c7a2f67c6912e04e15f6011d" exitCode=0 Feb 23 14:50:22.795898 master-0 kubenswrapper[28758]: I0223 14:50:22.795309 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" event={"ID":"708dbd36-4a71-40d9-8de7-827de4921a8a","Type":"ContainerDied","Data":"92c8c5e5998cae5570d1aa787109bc8e92bb5d07c7a2f67c6912e04e15f6011d"} Feb 
23 14:50:22.799034 master-0 kubenswrapper[28758]: I0223 14:50:22.798905 28758 generic.go:334] "Generic (PLEG): container finished" podID="04a17006-c08e-4ac7-8b6f-f412b4249c6c" containerID="2897b78f65d741597b65d80d7ce9ae6c67ba520a4aa7a6776404a9ba4cebb879" exitCode=0 Feb 23 14:50:22.799034 master-0 kubenswrapper[28758]: I0223 14:50:22.798968 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rk2l6" event={"ID":"04a17006-c08e-4ac7-8b6f-f412b4249c6c","Type":"ContainerDied","Data":"2897b78f65d741597b65d80d7ce9ae6c67ba520a4aa7a6776404a9ba4cebb879"} Feb 23 14:50:22.801644 master-0 kubenswrapper[28758]: I0223 14:50:22.801586 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8kt4c" event={"ID":"98bc143e-fd3c-4a62-a069-5e2357cb2209","Type":"ContainerStarted","Data":"864f3c07d114d4242476f8231283727d6e18c11c580c1e6f8dddb6379ad9dea6"} Feb 23 14:50:22.804289 master-0 kubenswrapper[28758]: I0223 14:50:22.804176 28758 generic.go:334] "Generic (PLEG): container finished" podID="2e586dba-e451-4458-972b-223909c7901e" containerID="abcc5bdf5d9d30d513b449d5db08480f5e24894fb2a696a2e1ce51134aa83860" exitCode=0 Feb 23 14:50:22.804289 master-0 kubenswrapper[28758]: I0223 14:50:22.804217 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tggnt" event={"ID":"2e586dba-e451-4458-972b-223909c7901e","Type":"ContainerDied","Data":"abcc5bdf5d9d30d513b449d5db08480f5e24894fb2a696a2e1ce51134aa83860"} Feb 23 14:50:22.806224 master-0 kubenswrapper[28758]: I0223 14:50:22.806197 28758 generic.go:334] "Generic (PLEG): container finished" podID="f2f2ca95-2590-4599-8b0f-4714ed65c1b4" containerID="43ebae455950d08989e69efc2e780b795372f8bc26c6b93ffee2179991f345a2" exitCode=0 Feb 23 14:50:22.806323 master-0 kubenswrapper[28758]: I0223 14:50:22.806233 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p68gm" 
event={"ID":"f2f2ca95-2590-4599-8b0f-4714ed65c1b4","Type":"ContainerDied","Data":"43ebae455950d08989e69efc2e780b795372f8bc26c6b93ffee2179991f345a2"} Feb 23 14:50:22.930404 master-0 kubenswrapper[28758]: I0223 14:50:22.930323 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-8kt4c" podStartSLOduration=2.995514784 podStartE2EDuration="7.930302418s" podCreationTimestamp="2026-02-23 14:50:15 +0000 UTC" firstStartedPulling="2026-02-23 14:50:16.671778543 +0000 UTC m=+948.798094485" lastFinishedPulling="2026-02-23 14:50:21.606566197 +0000 UTC m=+953.732882119" observedRunningTime="2026-02-23 14:50:22.914033396 +0000 UTC m=+955.040349348" watchObservedRunningTime="2026-02-23 14:50:22.930302418 +0000 UTC m=+955.056618350" Feb 23 14:50:23.623582 master-0 kubenswrapper[28758]: I0223 14:50:23.623236 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:50:24.859320 master-0 kubenswrapper[28758]: I0223 14:50:24.859188 28758 generic.go:334] "Generic (PLEG): container finished" podID="98bc143e-fd3c-4a62-a069-5e2357cb2209" containerID="864f3c07d114d4242476f8231283727d6e18c11c580c1e6f8dddb6379ad9dea6" exitCode=0 Feb 23 14:50:24.859320 master-0 kubenswrapper[28758]: I0223 14:50:24.859257 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8kt4c" event={"ID":"98bc143e-fd3c-4a62-a069-5e2357cb2209","Type":"ContainerDied","Data":"864f3c07d114d4242476f8231283727d6e18c11c580c1e6f8dddb6379ad9dea6"} Feb 23 14:50:26.456236 master-0 kubenswrapper[28758]: I0223 14:50:26.456173 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b8b575dff-c9z8b" Feb 23 14:50:26.554451 master-0 kubenswrapper[28758]: I0223 14:50:26.553760 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84f9bb9b4d-7ttsw"] Feb 23 14:50:26.556566 master-0 kubenswrapper[28758]: 
I0223 14:50:26.556000 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84f9bb9b4d-7ttsw" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-api" containerID="cri-o://a074856ff5b977bc702c4e244794e3d8c685bcf0b57c1ebe80bebe8f24288af9" gracePeriod=30 Feb 23 14:50:26.556566 master-0 kubenswrapper[28758]: I0223 14:50:26.556109 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84f9bb9b4d-7ttsw" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-httpd" containerID="cri-o://f445dd7a9552e21e8b19b2f1128fc3bdf2dd91e14d688462a2ecd74bf371a0f5" gracePeriod=30 Feb 23 14:50:26.920033 master-0 kubenswrapper[28758]: I0223 14:50:26.919965 28758 generic.go:334] "Generic (PLEG): container finished" podID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerID="f445dd7a9552e21e8b19b2f1128fc3bdf2dd91e14d688462a2ecd74bf371a0f5" exitCode=0 Feb 23 14:50:26.920033 master-0 kubenswrapper[28758]: I0223 14:50:26.920032 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f9bb9b4d-7ttsw" event={"ID":"8ee1afef-0950-4013-95a6-e6ae71a155eb","Type":"ContainerDied","Data":"f445dd7a9552e21e8b19b2f1128fc3bdf2dd91e14d688462a2ecd74bf371a0f5"} Feb 23 14:50:27.319970 master-0 kubenswrapper[28758]: I0223 14:50:27.319910 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:27.322171 master-0 kubenswrapper[28758]: I0223 14:50:27.322145 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" Feb 23 14:50:28.706502 master-0 kubenswrapper[28758]: I0223 14:50:28.706430 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:28.763980 master-0 kubenswrapper[28758]: I0223 14:50:28.763905 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tknnk\" (UniqueName: \"kubernetes.io/projected/2e586dba-e451-4458-972b-223909c7901e-kube-api-access-tknnk\") pod \"2e586dba-e451-4458-972b-223909c7901e\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " Feb 23 14:50:28.764831 master-0 kubenswrapper[28758]: I0223 14:50:28.764262 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e586dba-e451-4458-972b-223909c7901e-operator-scripts\") pod \"2e586dba-e451-4458-972b-223909c7901e\" (UID: \"2e586dba-e451-4458-972b-223909c7901e\") " Feb 23 14:50:28.768720 master-0 kubenswrapper[28758]: I0223 14:50:28.768658 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e586dba-e451-4458-972b-223909c7901e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e586dba-e451-4458-972b-223909c7901e" (UID: "2e586dba-e451-4458-972b-223909c7901e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:28.788693 master-0 kubenswrapper[28758]: I0223 14:50:28.788616 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e586dba-e451-4458-972b-223909c7901e-kube-api-access-tknnk" (OuterVolumeSpecName: "kube-api-access-tknnk") pod "2e586dba-e451-4458-972b-223909c7901e" (UID: "2e586dba-e451-4458-972b-223909c7901e"). InnerVolumeSpecName "kube-api-access-tknnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:28.870994 master-0 kubenswrapper[28758]: I0223 14:50:28.870869 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tknnk\" (UniqueName: \"kubernetes.io/projected/2e586dba-e451-4458-972b-223909c7901e-kube-api-access-tknnk\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:28.870994 master-0 kubenswrapper[28758]: I0223 14:50:28.870922 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e586dba-e451-4458-972b-223909c7901e-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:28.950858 master-0 kubenswrapper[28758]: I0223 14:50:28.950775 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tggnt" event={"ID":"2e586dba-e451-4458-972b-223909c7901e","Type":"ContainerDied","Data":"dce1a6f8f6f8345f897314b1b1ad244d7c6b80e59a9f5027cee6424581027e7c"} Feb 23 14:50:28.950858 master-0 kubenswrapper[28758]: I0223 14:50:28.950861 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce1a6f8f6f8345f897314b1b1ad244d7c6b80e59a9f5027cee6424581027e7c" Feb 23 14:50:28.951204 master-0 kubenswrapper[28758]: I0223 14:50:28.950953 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tggnt" Feb 23 14:50:30.090794 master-0 kubenswrapper[28758]: I0223 14:50:30.090662 28758 scope.go:117] "RemoveContainer" containerID="7a8e55d4703a52a75f17c69a848f147a36062c67b0d146662f6b138cb546db14" Feb 23 14:50:31.997186 master-0 kubenswrapper[28758]: I0223 14:50:31.996907 28758 generic.go:334] "Generic (PLEG): container finished" podID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerID="a074856ff5b977bc702c4e244794e3d8c685bcf0b57c1ebe80bebe8f24288af9" exitCode=0 Feb 23 14:50:31.997186 master-0 kubenswrapper[28758]: I0223 14:50:31.996952 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f9bb9b4d-7ttsw" event={"ID":"8ee1afef-0950-4013-95a6-e6ae71a155eb","Type":"ContainerDied","Data":"a074856ff5b977bc702c4e244794e3d8c685bcf0b57c1ebe80bebe8f24288af9"} Feb 23 14:50:33.815734 master-0 kubenswrapper[28758]: I0223 14:50:33.812104 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:33.843063 master-0 kubenswrapper[28758]: I0223 14:50:33.840013 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:33.873785 master-0 kubenswrapper[28758]: I0223 14:50:33.873721 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:33.917741 master-0 kubenswrapper[28758]: I0223 14:50:33.917638 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:33.925306 master-0 kubenswrapper[28758]: I0223 14:50:33.925236 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/708dbd36-4a71-40d9-8de7-827de4921a8a-operator-scripts\") pod \"708dbd36-4a71-40d9-8de7-827de4921a8a\" (UID: \"708dbd36-4a71-40d9-8de7-827de4921a8a\") " Feb 23 14:50:33.925547 master-0 kubenswrapper[28758]: I0223 14:50:33.925370 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-config\") pod \"98bc143e-fd3c-4a62-a069-5e2357cb2209\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " Feb 23 14:50:33.925845 master-0 kubenswrapper[28758]: I0223 14:50:33.925554 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-scripts\") pod \"98bc143e-fd3c-4a62-a069-5e2357cb2209\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " Feb 23 14:50:33.925845 master-0 kubenswrapper[28758]: I0223 14:50:33.925626 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98bc143e-fd3c-4a62-a069-5e2357cb2209-etc-podinfo\") pod \"98bc143e-fd3c-4a62-a069-5e2357cb2209\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " Feb 23 14:50:33.925845 master-0 kubenswrapper[28758]: I0223 14:50:33.925703 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic\") pod \"98bc143e-fd3c-4a62-a069-5e2357cb2209\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " Feb 23 14:50:33.927681 master-0 kubenswrapper[28758]: I0223 14:50:33.926068 28758 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gqww6\" (UniqueName: \"kubernetes.io/projected/708dbd36-4a71-40d9-8de7-827de4921a8a-kube-api-access-gqww6\") pod \"708dbd36-4a71-40d9-8de7-827de4921a8a\" (UID: \"708dbd36-4a71-40d9-8de7-827de4921a8a\") " Feb 23 14:50:33.927681 master-0 kubenswrapper[28758]: I0223 14:50:33.926142 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"98bc143e-fd3c-4a62-a069-5e2357cb2209\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " Feb 23 14:50:33.927681 master-0 kubenswrapper[28758]: I0223 14:50:33.926200 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24s57\" (UniqueName: \"kubernetes.io/projected/98bc143e-fd3c-4a62-a069-5e2357cb2209-kube-api-access-24s57\") pod \"98bc143e-fd3c-4a62-a069-5e2357cb2209\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " Feb 23 14:50:33.927681 master-0 kubenswrapper[28758]: I0223 14:50:33.926332 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-combined-ca-bundle\") pod \"98bc143e-fd3c-4a62-a069-5e2357cb2209\" (UID: \"98bc143e-fd3c-4a62-a069-5e2357cb2209\") " Feb 23 14:50:33.927681 master-0 kubenswrapper[28758]: I0223 14:50:33.927459 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "98bc143e-fd3c-4a62-a069-5e2357cb2209" (UID: "98bc143e-fd3c-4a62-a069-5e2357cb2209"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:33.927913 master-0 kubenswrapper[28758]: I0223 14:50:33.927849 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/708dbd36-4a71-40d9-8de7-827de4921a8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "708dbd36-4a71-40d9-8de7-827de4921a8a" (UID: "708dbd36-4a71-40d9-8de7-827de4921a8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:33.928279 master-0 kubenswrapper[28758]: I0223 14:50:33.928243 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/708dbd36-4a71-40d9-8de7-827de4921a8a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:33.928345 master-0 kubenswrapper[28758]: I0223 14:50:33.928278 28758 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:33.931325 master-0 kubenswrapper[28758]: I0223 14:50:33.931251 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "98bc143e-fd3c-4a62-a069-5e2357cb2209" (UID: "98bc143e-fd3c-4a62-a069-5e2357cb2209"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:33.932161 master-0 kubenswrapper[28758]: I0223 14:50:33.932114 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:33.936432 master-0 kubenswrapper[28758]: I0223 14:50:33.936386 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-scripts" (OuterVolumeSpecName: "scripts") pod "98bc143e-fd3c-4a62-a069-5e2357cb2209" (UID: "98bc143e-fd3c-4a62-a069-5e2357cb2209"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:33.962615 master-0 kubenswrapper[28758]: I0223 14:50:33.962524 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:33.962906 master-0 kubenswrapper[28758]: I0223 14:50:33.962630 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708dbd36-4a71-40d9-8de7-827de4921a8a-kube-api-access-gqww6" (OuterVolumeSpecName: "kube-api-access-gqww6") pod "708dbd36-4a71-40d9-8de7-827de4921a8a" (UID: "708dbd36-4a71-40d9-8de7-827de4921a8a"). InnerVolumeSpecName "kube-api-access-gqww6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:33.963620 master-0 kubenswrapper[28758]: I0223 14:50:33.963134 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/98bc143e-fd3c-4a62-a069-5e2357cb2209-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "98bc143e-fd3c-4a62-a069-5e2357cb2209" (UID: "98bc143e-fd3c-4a62-a069-5e2357cb2209"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 14:50:33.963620 master-0 kubenswrapper[28758]: I0223 14:50:33.963257 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc143e-fd3c-4a62-a069-5e2357cb2209-kube-api-access-24s57" (OuterVolumeSpecName: "kube-api-access-24s57") pod "98bc143e-fd3c-4a62-a069-5e2357cb2209" (UID: "98bc143e-fd3c-4a62-a069-5e2357cb2209"). InnerVolumeSpecName "kube-api-access-24s57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:34.030339 master-0 kubenswrapper[28758]: I0223 14:50:34.030018 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8qvx\" (UniqueName: \"kubernetes.io/projected/04a17006-c08e-4ac7-8b6f-f412b4249c6c-kube-api-access-g8qvx\") pod \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " Feb 23 14:50:34.030339 master-0 kubenswrapper[28758]: I0223 14:50:34.030290 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a17006-c08e-4ac7-8b6f-f412b4249c6c-operator-scripts\") pod \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\" (UID: \"04a17006-c08e-4ac7-8b6f-f412b4249c6c\") " Feb 23 14:50:34.030920 master-0 kubenswrapper[28758]: I0223 14:50:34.030587 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l9kl\" (UniqueName: \"kubernetes.io/projected/48f37d7c-b625-42da-93ff-b6d5a1702356-kube-api-access-8l9kl\") pod \"48f37d7c-b625-42da-93ff-b6d5a1702356\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " Feb 23 14:50:34.030920 master-0 kubenswrapper[28758]: I0223 14:50:34.030688 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l7lz\" (UniqueName: \"kubernetes.io/projected/4ad3874c-6300-4cff-80d7-b332ebc88e5d-kube-api-access-5l7lz\") pod 
\"4ad3874c-6300-4cff-80d7-b332ebc88e5d\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " Feb 23 14:50:34.030920 master-0 kubenswrapper[28758]: I0223 14:50:34.030863 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad3874c-6300-4cff-80d7-b332ebc88e5d-operator-scripts\") pod \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\" (UID: \"4ad3874c-6300-4cff-80d7-b332ebc88e5d\") " Feb 23 14:50:34.031290 master-0 kubenswrapper[28758]: I0223 14:50:34.030933 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f37d7c-b625-42da-93ff-b6d5a1702356-operator-scripts\") pod \"48f37d7c-b625-42da-93ff-b6d5a1702356\" (UID: \"48f37d7c-b625-42da-93ff-b6d5a1702356\") " Feb 23 14:50:34.032117 master-0 kubenswrapper[28758]: I0223 14:50:34.032080 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.032855 master-0 kubenswrapper[28758]: I0223 14:50:34.032800 28758 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/98bc143e-fd3c-4a62-a069-5e2357cb2209-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.032946 master-0 kubenswrapper[28758]: I0223 14:50:34.032872 28758 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/98bc143e-fd3c-4a62-a069-5e2357cb2209-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.032946 master-0 kubenswrapper[28758]: I0223 14:50:34.032891 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqww6\" (UniqueName: \"kubernetes.io/projected/708dbd36-4a71-40d9-8de7-827de4921a8a-kube-api-access-gqww6\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.032946 
master-0 kubenswrapper[28758]: I0223 14:50:34.032904 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24s57\" (UniqueName: \"kubernetes.io/projected/98bc143e-fd3c-4a62-a069-5e2357cb2209-kube-api-access-24s57\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.033579 master-0 kubenswrapper[28758]: I0223 14:50:34.033544 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f37d7c-b625-42da-93ff-b6d5a1702356-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "48f37d7c-b625-42da-93ff-b6d5a1702356" (UID: "48f37d7c-b625-42da-93ff-b6d5a1702356"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:34.034072 master-0 kubenswrapper[28758]: I0223 14:50:34.034040 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ad3874c-6300-4cff-80d7-b332ebc88e5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ad3874c-6300-4cff-80d7-b332ebc88e5d" (UID: "4ad3874c-6300-4cff-80d7-b332ebc88e5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:34.035130 master-0 kubenswrapper[28758]: I0223 14:50:34.035072 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a17006-c08e-4ac7-8b6f-f412b4249c6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04a17006-c08e-4ac7-8b6f-f412b4249c6c" (UID: "04a17006-c08e-4ac7-8b6f-f412b4249c6c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:34.044303 master-0 kubenswrapper[28758]: I0223 14:50:34.044242 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p68gm" event={"ID":"f2f2ca95-2590-4599-8b0f-4714ed65c1b4","Type":"ContainerDied","Data":"bf0f0600d7d37aad4b5a82f93c1da195be32efd3deb3d8d78c209ae0cccf8415"} Feb 23 14:50:34.044397 master-0 kubenswrapper[28758]: I0223 14:50:34.044306 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf0f0600d7d37aad4b5a82f93c1da195be32efd3deb3d8d78c209ae0cccf8415" Feb 23 14:50:34.044397 master-0 kubenswrapper[28758]: I0223 14:50:34.044258 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p68gm" Feb 23 14:50:34.046443 master-0 kubenswrapper[28758]: I0223 14:50:34.046420 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" Feb 23 14:50:34.046443 master-0 kubenswrapper[28758]: I0223 14:50:34.046410 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d9a-account-create-update-47dvn" event={"ID":"48f37d7c-b625-42da-93ff-b6d5a1702356","Type":"ContainerDied","Data":"64620923c66062f6f5afb17ba032d5205ee2fd42075f5f415736d215c1278faf"} Feb 23 14:50:34.046590 master-0 kubenswrapper[28758]: I0223 14:50:34.046463 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64620923c66062f6f5afb17ba032d5205ee2fd42075f5f415736d215c1278faf" Feb 23 14:50:34.048027 master-0 kubenswrapper[28758]: I0223 14:50:34.047999 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" event={"ID":"708dbd36-4a71-40d9-8de7-827de4921a8a","Type":"ContainerDied","Data":"6dc9a9b1b899f98cea70fae7762222cafbf34b0247e6e079c36639b87000de87"} Feb 23 14:50:34.048027 master-0 
kubenswrapper[28758]: I0223 14:50:34.048030 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc9a9b1b899f98cea70fae7762222cafbf34b0247e6e079c36639b87000de87" Feb 23 14:50:34.048171 master-0 kubenswrapper[28758]: I0223 14:50:34.048110 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-73e0-account-create-update-v8x2s" Feb 23 14:50:34.052421 master-0 kubenswrapper[28758]: I0223 14:50:34.052000 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f37d7c-b625-42da-93ff-b6d5a1702356-kube-api-access-8l9kl" (OuterVolumeSpecName: "kube-api-access-8l9kl") pod "48f37d7c-b625-42da-93ff-b6d5a1702356" (UID: "48f37d7c-b625-42da-93ff-b6d5a1702356"). InnerVolumeSpecName "kube-api-access-8l9kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:34.053730 master-0 kubenswrapper[28758]: I0223 14:50:34.053646 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a17006-c08e-4ac7-8b6f-f412b4249c6c-kube-api-access-g8qvx" (OuterVolumeSpecName: "kube-api-access-g8qvx") pod "04a17006-c08e-4ac7-8b6f-f412b4249c6c" (UID: "04a17006-c08e-4ac7-8b6f-f412b4249c6c"). InnerVolumeSpecName "kube-api-access-g8qvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:34.057424 master-0 kubenswrapper[28758]: I0223 14:50:34.057354 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ad3874c-6300-4cff-80d7-b332ebc88e5d-kube-api-access-5l7lz" (OuterVolumeSpecName: "kube-api-access-5l7lz") pod "4ad3874c-6300-4cff-80d7-b332ebc88e5d" (UID: "4ad3874c-6300-4cff-80d7-b332ebc88e5d"). InnerVolumeSpecName "kube-api-access-5l7lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:34.060655 master-0 kubenswrapper[28758]: I0223 14:50:34.060588 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerStarted","Data":"9fac8bae38a13ee223ad77eb26713fc97e7e4c43fcb76b8a9f6864525859b3ef"} Feb 23 14:50:34.064684 master-0 kubenswrapper[28758]: I0223 14:50:34.064494 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" event={"ID":"8c0c40ba-2093-4e1a-8166-bbb4c53f3a08","Type":"ContainerStarted","Data":"a46f245bc53f009fa76b72cae5a6e0ea1fbc7b4a3432009b103e1880544f6e79"} Feb 23 14:50:34.065247 master-0 kubenswrapper[28758]: I0223 14:50:34.064898 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:34.075636 master-0 kubenswrapper[28758]: I0223 14:50:34.075571 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-844d-account-create-update-l79ql" event={"ID":"4ad3874c-6300-4cff-80d7-b332ebc88e5d","Type":"ContainerDied","Data":"54a3df2feb3d641a35a85c9cdccf71898cc68d764a1cbcb2195f02bf0d8151e9"} Feb 23 14:50:34.075636 master-0 kubenswrapper[28758]: I0223 14:50:34.075637 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a3df2feb3d641a35a85c9cdccf71898cc68d764a1cbcb2195f02bf0d8151e9" Feb 23 14:50:34.075917 master-0 kubenswrapper[28758]: I0223 14:50:34.075720 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-844d-account-create-update-l79ql" Feb 23 14:50:34.080432 master-0 kubenswrapper[28758]: I0223 14:50:34.079962 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rk2l6" event={"ID":"04a17006-c08e-4ac7-8b6f-f412b4249c6c","Type":"ContainerDied","Data":"c367da00d44ca612e78803453161061c6f74752b934353ee3c8e35d18ee7f2e3"} Feb 23 14:50:34.080432 master-0 kubenswrapper[28758]: I0223 14:50:34.080028 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c367da00d44ca612e78803453161061c6f74752b934353ee3c8e35d18ee7f2e3" Feb 23 14:50:34.080432 master-0 kubenswrapper[28758]: I0223 14:50:34.080109 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rk2l6" Feb 23 14:50:34.092513 master-0 kubenswrapper[28758]: I0223 14:50:34.086128 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:50:34.092513 master-0 kubenswrapper[28758]: I0223 14:50:34.087319 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-8kt4c" event={"ID":"98bc143e-fd3c-4a62-a069-5e2357cb2209","Type":"ContainerDied","Data":"86b385bb89c71ab6eb6fd05d275d5171f4997cccacf605af20d019d9846a5b6f"} Feb 23 14:50:34.092513 master-0 kubenswrapper[28758]: I0223 14:50:34.087405 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86b385bb89c71ab6eb6fd05d275d5171f4997cccacf605af20d019d9846a5b6f" Feb 23 14:50:34.092513 master-0 kubenswrapper[28758]: I0223 14:50:34.087438 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-8kt4c" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.134295 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd9c6\" (UniqueName: \"kubernetes.io/projected/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-kube-api-access-cd9c6\") pod \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.134462 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-operator-scripts\") pod \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\" (UID: \"f2f2ca95-2590-4599-8b0f-4714ed65c1b4\") " Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.135247 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8qvx\" (UniqueName: \"kubernetes.io/projected/04a17006-c08e-4ac7-8b6f-f412b4249c6c-kube-api-access-g8qvx\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.135271 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04a17006-c08e-4ac7-8b6f-f412b4249c6c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.135281 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l9kl\" (UniqueName: \"kubernetes.io/projected/48f37d7c-b625-42da-93ff-b6d5a1702356-kube-api-access-8l9kl\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.135290 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l7lz\" (UniqueName: \"kubernetes.io/projected/4ad3874c-6300-4cff-80d7-b332ebc88e5d-kube-api-access-5l7lz\") on node 
\"master-0\" DevicePath \"\"" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.135301 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ad3874c-6300-4cff-80d7-b332ebc88e5d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.135391 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2f2ca95-2590-4599-8b0f-4714ed65c1b4" (UID: "f2f2ca95-2590-4599-8b0f-4714ed65c1b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.135309 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48f37d7c-b625-42da-93ff-b6d5a1702356-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.138585 master-0 kubenswrapper[28758]: I0223 14:50:34.138151 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-kube-api-access-cd9c6" (OuterVolumeSpecName: "kube-api-access-cd9c6") pod "f2f2ca95-2590-4599-8b0f-4714ed65c1b4" (UID: "f2f2ca95-2590-4599-8b0f-4714ed65c1b4"). InnerVolumeSpecName "kube-api-access-cd9c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:34.146733 master-0 kubenswrapper[28758]: I0223 14:50:34.145770 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98bc143e-fd3c-4a62-a069-5e2357cb2209" (UID: "98bc143e-fd3c-4a62-a069-5e2357cb2209"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:34.162899 master-0 kubenswrapper[28758]: I0223 14:50:34.161167 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-config" (OuterVolumeSpecName: "config") pod "98bc143e-fd3c-4a62-a069-5e2357cb2209" (UID: "98bc143e-fd3c-4a62-a069-5e2357cb2209"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:34.237509 master-0 kubenswrapper[28758]: I0223 14:50:34.237418 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-config\") pod \"8ee1afef-0950-4013-95a6-e6ae71a155eb\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " Feb 23 14:50:34.237743 master-0 kubenswrapper[28758]: I0223 14:50:34.237651 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvsvk\" (UniqueName: \"kubernetes.io/projected/8ee1afef-0950-4013-95a6-e6ae71a155eb-kube-api-access-rvsvk\") pod \"8ee1afef-0950-4013-95a6-e6ae71a155eb\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " Feb 23 14:50:34.238510 master-0 kubenswrapper[28758]: I0223 14:50:34.237836 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-httpd-config\") pod \"8ee1afef-0950-4013-95a6-e6ae71a155eb\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " Feb 23 14:50:34.238510 master-0 kubenswrapper[28758]: I0223 14:50:34.237868 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-ovndb-tls-certs\") pod \"8ee1afef-0950-4013-95a6-e6ae71a155eb\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " Feb 23 14:50:34.238510 master-0 kubenswrapper[28758]: I0223 
14:50:34.237896 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-combined-ca-bundle\") pod \"8ee1afef-0950-4013-95a6-e6ae71a155eb\" (UID: \"8ee1afef-0950-4013-95a6-e6ae71a155eb\") " Feb 23 14:50:34.238665 master-0 kubenswrapper[28758]: I0223 14:50:34.238626 28758 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-operator-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.238665 master-0 kubenswrapper[28758]: I0223 14:50:34.238645 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd9c6\" (UniqueName: \"kubernetes.io/projected/f2f2ca95-2590-4599-8b0f-4714ed65c1b4-kube-api-access-cd9c6\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.238665 master-0 kubenswrapper[28758]: I0223 14:50:34.238659 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.238796 master-0 kubenswrapper[28758]: I0223 14:50:34.238673 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/98bc143e-fd3c-4a62-a069-5e2357cb2209-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.246514 master-0 kubenswrapper[28758]: I0223 14:50:34.244727 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee1afef-0950-4013-95a6-e6ae71a155eb-kube-api-access-rvsvk" (OuterVolumeSpecName: "kube-api-access-rvsvk") pod "8ee1afef-0950-4013-95a6-e6ae71a155eb" (UID: "8ee1afef-0950-4013-95a6-e6ae71a155eb"). InnerVolumeSpecName "kube-api-access-rvsvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:34.246514 master-0 kubenswrapper[28758]: I0223 14:50:34.245221 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8ee1afef-0950-4013-95a6-e6ae71a155eb" (UID: "8ee1afef-0950-4013-95a6-e6ae71a155eb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:34.335579 master-0 kubenswrapper[28758]: I0223 14:50:34.329237 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-config" (OuterVolumeSpecName: "config") pod "8ee1afef-0950-4013-95a6-e6ae71a155eb" (UID: "8ee1afef-0950-4013-95a6-e6ae71a155eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:34.347522 master-0 kubenswrapper[28758]: I0223 14:50:34.341009 28758 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-httpd-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.347522 master-0 kubenswrapper[28758]: I0223 14:50:34.341082 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.347522 master-0 kubenswrapper[28758]: I0223 14:50:34.341095 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvsvk\" (UniqueName: \"kubernetes.io/projected/8ee1afef-0950-4013-95a6-e6ae71a155eb-kube-api-access-rvsvk\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.372512 master-0 kubenswrapper[28758]: I0223 14:50:34.367953 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ee1afef-0950-4013-95a6-e6ae71a155eb" (UID: "8ee1afef-0950-4013-95a6-e6ae71a155eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:34.383517 master-0 kubenswrapper[28758]: I0223 14:50:34.383349 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8ee1afef-0950-4013-95a6-e6ae71a155eb" (UID: "8ee1afef-0950-4013-95a6-e6ae71a155eb"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:34.443709 master-0 kubenswrapper[28758]: I0223 14:50:34.443640 28758 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.443709 master-0 kubenswrapper[28758]: I0223 14:50:34.443687 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ee1afef-0950-4013-95a6-e6ae71a155eb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:34.444243 master-0 kubenswrapper[28758]: I0223 14:50:34.443737 28758 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod4021d59b-dfa8-49cc-b55c-48469a02b971"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod4021d59b-dfa8-49cc-b55c-48469a02b971] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4021d59b_dfa8_49cc_b55c_48469a02b971.slice" Feb 23 14:50:34.444243 master-0 kubenswrapper[28758]: E0223 14:50:34.443767 28758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort 
pod4021d59b-dfa8-49cc-b55c-48469a02b971] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod4021d59b-dfa8-49cc-b55c-48469a02b971] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4021d59b_dfa8_49cc_b55c_48469a02b971.slice" pod="openstack/cinder-2d990-scheduler-0" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" Feb 23 14:50:35.102340 master-0 kubenswrapper[28758]: I0223 14:50:35.102271 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f9bb9b4d-7ttsw" event={"ID":"8ee1afef-0950-4013-95a6-e6ae71a155eb","Type":"ContainerDied","Data":"1709dd9011be00258bddd92c5bbc5b79ca173ed231d874f86ca8a08dab81637f"} Feb 23 14:50:35.102340 master-0 kubenswrapper[28758]: I0223 14:50:35.102347 28758 scope.go:117] "RemoveContainer" containerID="f445dd7a9552e21e8b19b2f1128fc3bdf2dd91e14d688462a2ecd74bf371a0f5" Feb 23 14:50:35.103015 master-0 kubenswrapper[28758]: I0223 14:50:35.102526 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-84f9bb9b4d-7ttsw" Feb 23 14:50:35.108186 master-0 kubenswrapper[28758]: I0223 14:50:35.108122 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"718a5e05-c1d8-4982-8808-135e9883dab9","Type":"ContainerStarted","Data":"d8a7a001956b6f48704adb6b9de1152ed65185bfe97783e9c8b74fe422deaf51"} Feb 23 14:50:35.108308 master-0 kubenswrapper[28758]: I0223 14:50:35.108271 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.133245 master-0 kubenswrapper[28758]: I0223 14:50:35.133135 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.452312763 podStartE2EDuration="23.133104456s" podCreationTimestamp="2026-02-23 14:50:12 +0000 UTC" firstStartedPulling="2026-02-23 14:50:13.863962602 +0000 UTC m=+945.990278534" lastFinishedPulling="2026-02-23 14:50:33.544754295 +0000 UTC m=+965.671070227" observedRunningTime="2026-02-23 14:50:35.128405171 +0000 UTC m=+967.254721103" watchObservedRunningTime="2026-02-23 14:50:35.133104456 +0000 UTC m=+967.259420398" Feb 23 14:50:35.135298 master-0 kubenswrapper[28758]: I0223 14:50:35.135130 28758 scope.go:117] "RemoveContainer" containerID="a074856ff5b977bc702c4e244794e3d8c685bcf0b57c1ebe80bebe8f24288af9" Feb 23 14:50:35.209127 master-0 kubenswrapper[28758]: I0223 14:50:35.205148 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:50:35.222452 master-0 kubenswrapper[28758]: I0223 14:50:35.222342 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:50:35.246117 master-0 kubenswrapper[28758]: I0223 14:50:35.245992 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84f9bb9b4d-7ttsw"] Feb 23 14:50:35.332595 master-0 kubenswrapper[28758]: I0223 14:50:35.332527 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84f9bb9b4d-7ttsw"] Feb 23 14:50:35.346772 master-0 kubenswrapper[28758]: I0223 14:50:35.346664 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:50:35.347967 master-0 kubenswrapper[28758]: E0223 14:50:35.347920 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-httpd" Feb 23 14:50:35.348046 master-0 
kubenswrapper[28758]: I0223 14:50:35.347974 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-httpd" Feb 23 14:50:35.348046 master-0 kubenswrapper[28758]: E0223 14:50:35.348011 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-api" Feb 23 14:50:35.348046 master-0 kubenswrapper[28758]: I0223 14:50:35.348027 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-api" Feb 23 14:50:35.348140 master-0 kubenswrapper[28758]: E0223 14:50:35.348103 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e586dba-e451-4458-972b-223909c7901e" containerName="mariadb-database-create" Feb 23 14:50:35.348140 master-0 kubenswrapper[28758]: I0223 14:50:35.348116 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e586dba-e451-4458-972b-223909c7901e" containerName="mariadb-database-create" Feb 23 14:50:35.348140 master-0 kubenswrapper[28758]: E0223 14:50:35.348129 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2f2ca95-2590-4599-8b0f-4714ed65c1b4" containerName="mariadb-database-create" Feb 23 14:50:35.348140 master-0 kubenswrapper[28758]: I0223 14:50:35.348136 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2f2ca95-2590-4599-8b0f-4714ed65c1b4" containerName="mariadb-database-create" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: E0223 14:50:35.348150 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bc143e-fd3c-4a62-a069-5e2357cb2209" containerName="ironic-inspector-db-sync" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: I0223 14:50:35.348157 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bc143e-fd3c-4a62-a069-5e2357cb2209" containerName="ironic-inspector-db-sync" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: E0223 14:50:35.348181 28758 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708dbd36-4a71-40d9-8de7-827de4921a8a" containerName="mariadb-account-create-update" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: I0223 14:50:35.348188 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="708dbd36-4a71-40d9-8de7-827de4921a8a" containerName="mariadb-account-create-update" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: E0223 14:50:35.348211 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a17006-c08e-4ac7-8b6f-f412b4249c6c" containerName="mariadb-database-create" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: I0223 14:50:35.348220 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a17006-c08e-4ac7-8b6f-f412b4249c6c" containerName="mariadb-database-create" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: E0223 14:50:35.348251 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ad3874c-6300-4cff-80d7-b332ebc88e5d" containerName="mariadb-account-create-update" Feb 23 14:50:35.348267 master-0 kubenswrapper[28758]: I0223 14:50:35.348258 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ad3874c-6300-4cff-80d7-b332ebc88e5d" containerName="mariadb-account-create-update" Feb 23 14:50:35.348518 master-0 kubenswrapper[28758]: E0223 14:50:35.348287 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f37d7c-b625-42da-93ff-b6d5a1702356" containerName="mariadb-account-create-update" Feb 23 14:50:35.348518 master-0 kubenswrapper[28758]: I0223 14:50:35.348295 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f37d7c-b625-42da-93ff-b6d5a1702356" containerName="mariadb-account-create-update" Feb 23 14:50:35.348746 master-0 kubenswrapper[28758]: I0223 14:50:35.348678 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e586dba-e451-4458-972b-223909c7901e" containerName="mariadb-database-create" Feb 23 14:50:35.348805 master-0 kubenswrapper[28758]: 
I0223 14:50:35.348752 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-httpd" Feb 23 14:50:35.348805 master-0 kubenswrapper[28758]: I0223 14:50:35.348771 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a17006-c08e-4ac7-8b6f-f412b4249c6c" containerName="mariadb-database-create" Feb 23 14:50:35.348805 master-0 kubenswrapper[28758]: I0223 14:50:35.348794 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ad3874c-6300-4cff-80d7-b332ebc88e5d" containerName="mariadb-account-create-update" Feb 23 14:50:35.348901 master-0 kubenswrapper[28758]: I0223 14:50:35.348814 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bc143e-fd3c-4a62-a069-5e2357cb2209" containerName="ironic-inspector-db-sync" Feb 23 14:50:35.348901 master-0 kubenswrapper[28758]: I0223 14:50:35.348835 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2f2ca95-2590-4599-8b0f-4714ed65c1b4" containerName="mariadb-database-create" Feb 23 14:50:35.348901 master-0 kubenswrapper[28758]: I0223 14:50:35.348843 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" containerName="neutron-api" Feb 23 14:50:35.348901 master-0 kubenswrapper[28758]: I0223 14:50:35.348862 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f37d7c-b625-42da-93ff-b6d5a1702356" containerName="mariadb-account-create-update" Feb 23 14:50:35.348901 master-0 kubenswrapper[28758]: I0223 14:50:35.348873 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="708dbd36-4a71-40d9-8de7-827de4921a8a" containerName="mariadb-account-create-update" Feb 23 14:50:35.351261 master-0 kubenswrapper[28758]: I0223 14:50:35.351220 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.353961 master-0 kubenswrapper[28758]: I0223 14:50:35.353859 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-2d990-scheduler-config-data" Feb 23 14:50:35.362513 master-0 kubenswrapper[28758]: I0223 14:50:35.362413 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:50:35.479032 master-0 kubenswrapper[28758]: I0223 14:50:35.478932 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-scripts\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.479032 master-0 kubenswrapper[28758]: I0223 14:50:35.479025 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-config-data\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.479443 master-0 kubenswrapper[28758]: I0223 14:50:35.479335 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-config-data-custom\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.479518 master-0 kubenswrapper[28758]: I0223 14:50:35.479460 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cqvw\" (UniqueName: \"kubernetes.io/projected/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-kube-api-access-8cqvw\") pod \"cinder-2d990-scheduler-0\" (UID: 
\"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.479934 master-0 kubenswrapper[28758]: I0223 14:50:35.479893 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-etc-machine-id\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.480000 master-0 kubenswrapper[28758]: I0223 14:50:35.479955 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-combined-ca-bundle\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.583904 master-0 kubenswrapper[28758]: I0223 14:50:35.583805 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-scripts\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.583904 master-0 kubenswrapper[28758]: I0223 14:50:35.583886 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-config-data\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.584356 master-0 kubenswrapper[28758]: I0223 14:50:35.583964 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-config-data-custom\") pod \"cinder-2d990-scheduler-0\" (UID: 
\"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.584356 master-0 kubenswrapper[28758]: I0223 14:50:35.583991 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cqvw\" (UniqueName: \"kubernetes.io/projected/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-kube-api-access-8cqvw\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.584356 master-0 kubenswrapper[28758]: I0223 14:50:35.584150 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-etc-machine-id\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.584356 master-0 kubenswrapper[28758]: I0223 14:50:35.584174 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-combined-ca-bundle\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.586783 master-0 kubenswrapper[28758]: I0223 14:50:35.586700 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-etc-machine-id\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.590323 master-0 kubenswrapper[28758]: I0223 14:50:35.589827 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-scripts\") pod \"cinder-2d990-scheduler-0\" (UID: 
\"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.590323 master-0 kubenswrapper[28758]: I0223 14:50:35.589985 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-config-data\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.591332 master-0 kubenswrapper[28758]: I0223 14:50:35.591285 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-combined-ca-bundle\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.591449 master-0 kubenswrapper[28758]: I0223 14:50:35.591419 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-config-data-custom\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.614126 master-0 kubenswrapper[28758]: I0223 14:50:35.612960 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cqvw\" (UniqueName: \"kubernetes.io/projected/b7266a40-897f-41e1-a8bc-0bd0c7c0f268-kube-api-access-8cqvw\") pod \"cinder-2d990-scheduler-0\" (UID: \"b7266a40-897f-41e1-a8bc-0bd0c7c0f268\") " pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:35.685872 master-0 kubenswrapper[28758]: I0223 14:50:35.684899 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:36.126099 master-0 kubenswrapper[28758]: I0223 14:50:36.113445 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4021d59b-dfa8-49cc-b55c-48469a02b971" path="/var/lib/kubelet/pods/4021d59b-dfa8-49cc-b55c-48469a02b971/volumes" Feb 23 14:50:36.131256 master-0 kubenswrapper[28758]: I0223 14:50:36.130357 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee1afef-0950-4013-95a6-e6ae71a155eb" path="/var/lib/kubelet/pods/8ee1afef-0950-4013-95a6-e6ae71a155eb/volumes" Feb 23 14:50:36.230269 master-0 kubenswrapper[28758]: I0223 14:50:36.220966 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2d990-scheduler-0"] Feb 23 14:50:36.453518 master-0 kubenswrapper[28758]: I0223 14:50:36.449984 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c76dc76f7-sw64n"] Feb 23 14:50:36.485097 master-0 kubenswrapper[28758]: I0223 14:50:36.481405 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c76dc76f7-sw64n"] Feb 23 14:50:36.485097 master-0 kubenswrapper[28758]: I0223 14:50:36.481523 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.546634 master-0 kubenswrapper[28758]: I0223 14:50:36.546580 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-svc\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.546997 master-0 kubenswrapper[28758]: I0223 14:50:36.546690 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-sb\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.546997 master-0 kubenswrapper[28758]: I0223 14:50:36.546727 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw79x\" (UniqueName: \"kubernetes.io/projected/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-kube-api-access-cw79x\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.546997 master-0 kubenswrapper[28758]: I0223 14:50:36.546759 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-swift-storage-0\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.546997 master-0 kubenswrapper[28758]: I0223 14:50:36.546826 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-nb\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.546997 master-0 kubenswrapper[28758]: I0223 14:50:36.546844 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-config\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.615510 master-0 kubenswrapper[28758]: I0223 14:50:36.601462 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:36.615510 master-0 kubenswrapper[28758]: I0223 14:50:36.612775 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Feb 23 14:50:36.631511 master-0 kubenswrapper[28758]: I0223 14:50:36.618667 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Feb 23 14:50:36.631511 master-0 kubenswrapper[28758]: I0223 14:50:36.618950 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Feb 23 14:50:36.631511 master-0 kubenswrapper[28758]: I0223 14:50:36.619137 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662007 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-nb\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.664794 master-0 
kubenswrapper[28758]: I0223 14:50:36.662132 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk9nx\" (UniqueName: \"kubernetes.io/projected/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-kube-api-access-zk9nx\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662180 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-config\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662365 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662525 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-svc\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662700 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " 
pod="openstack/ironic-inspector-0" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662856 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-sb\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662937 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw79x\" (UniqueName: \"kubernetes.io/projected/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-kube-api-access-cw79x\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.662992 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-scripts\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.663037 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-swift-storage-0\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.663063 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: 
\"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.663133 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.663275 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-config\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.664794 master-0 kubenswrapper[28758]: I0223 14:50:36.664689 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-nb\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.667783 master-0 kubenswrapper[28758]: I0223 14:50:36.667090 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-769d7f49c5-jj8xx" Feb 23 14:50:36.676060 master-0 kubenswrapper[28758]: I0223 14:50:36.675977 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-svc\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.677585 master-0 kubenswrapper[28758]: I0223 14:50:36.676049 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-config\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.677585 master-0 kubenswrapper[28758]: I0223 14:50:36.676053 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-swift-storage-0\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.679300 master-0 kubenswrapper[28758]: I0223 14:50:36.679220 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-sb\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.691799 master-0 kubenswrapper[28758]: I0223 14:50:36.691677 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:36.701929 master-0 kubenswrapper[28758]: I0223 14:50:36.701818 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw79x\" (UniqueName: \"kubernetes.io/projected/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-kube-api-access-cw79x\") pod \"dnsmasq-dns-c76dc76f7-sw64n\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.768354 master-0 kubenswrapper[28758]: I0223 14:50:36.766223 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-scripts\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.768354 master-0 
kubenswrapper[28758]: I0223 14:50:36.766336 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.771679 master-0 kubenswrapper[28758]: I0223 14:50:36.770580 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.771679 master-0 kubenswrapper[28758]: I0223 14:50:36.770761 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-config\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.771679 master-0 kubenswrapper[28758]: I0223 14:50:36.770831 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk9nx\" (UniqueName: \"kubernetes.io/projected/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-kube-api-access-zk9nx\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.771679 master-0 kubenswrapper[28758]: I0223 14:50:36.770953 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.771679 master-0 kubenswrapper[28758]: I0223 14:50:36.771130 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.773762 master-0 kubenswrapper[28758]: I0223 14:50:36.773591 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.776654 master-0 kubenswrapper[28758]: I0223 14:50:36.776406 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.795977 master-0 kubenswrapper[28758]: I0223 14:50:36.795890 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.796289 master-0 kubenswrapper[28758]: I0223 14:50:36.796169 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.797570 master-0 kubenswrapper[28758]: I0223 14:50:36.796699 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-scripts\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.797570 master-0 kubenswrapper[28758]: I0223 14:50:36.796972 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-config\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.809914 master-0 kubenswrapper[28758]: I0223 14:50:36.807177 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk9nx\" (UniqueName: \"kubernetes.io/projected/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-kube-api-access-zk9nx\") pod \"ironic-inspector-0\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:36.879511 master-0 kubenswrapper[28758]: I0223 14:50:36.878733 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:36.994335 master-0 kubenswrapper[28758]: I0223 14:50:36.993539 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Feb 23 14:50:37.203604 master-0 kubenswrapper[28758]: I0223 14:50:37.203544 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"b7266a40-897f-41e1-a8bc-0bd0c7c0f268","Type":"ContainerStarted","Data":"d602fb836bccab1ff842c64178ecba835f17d39ba8e0abd6d3856a84b6060047"} Feb 23 14:50:37.533559 master-0 kubenswrapper[28758]: I0223 14:50:37.533500 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c76dc76f7-sw64n"] Feb 23 14:50:37.892692 master-0 kubenswrapper[28758]: I0223 14:50:37.884760 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:38.263558 master-0 kubenswrapper[28758]: I0223 14:50:38.263019 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" event={"ID":"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce","Type":"ContainerStarted","Data":"7fbd8935766a6a7baf4d15ec0f154b1d43bbb7d6395ff1b99eb58c2e47c43324"} Feb 23 14:50:38.272513 master-0 kubenswrapper[28758]: I0223 14:50:38.272032 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"b7266a40-897f-41e1-a8bc-0bd0c7c0f268","Type":"ContainerStarted","Data":"c4be56017a6b88f5e4238e7a575b4c77a13b8ca24db0916acfa0721c6f55ef53"} Feb 23 14:50:38.280163 master-0 kubenswrapper[28758]: I0223 14:50:38.279895 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f9f6fbc7-da78-4768-94ca-ed89ac38eec6","Type":"ContainerStarted","Data":"5d1a2c37ac72b57d6a4fb42ada2153306ad3d42652a5eccd5abb8950ec62c175"} Feb 23 14:50:38.284183 master-0 kubenswrapper[28758]: I0223 14:50:38.284120 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:38.478788 master-0 kubenswrapper[28758]: I0223 14:50:38.475699 28758 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fb8596466-gd59d" Feb 23 14:50:38.608519 master-0 kubenswrapper[28758]: I0223 14:50:38.602523 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6bdcbb4f68-p649t"] Feb 23 14:50:38.608519 master-0 kubenswrapper[28758]: I0223 14:50:38.602946 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6bdcbb4f68-p649t" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-log" containerID="cri-o://8df68d5ca8c4ef5606a78ffd1f8761494e1613cefdb191fc895a8b358a8b3d9c" gracePeriod=30 Feb 23 14:50:38.608519 master-0 kubenswrapper[28758]: I0223 14:50:38.603427 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6bdcbb4f68-p649t" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-api" containerID="cri-o://4f1c16bcc5617661be7a37f2eafd96e60f43859b2c7e4c4a5ff2bb5a0ce24f36" gracePeriod=30 Feb 23 14:50:39.304500 master-0 kubenswrapper[28758]: I0223 14:50:39.304382 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerDied","Data":"9fac8bae38a13ee223ad77eb26713fc97e7e4c43fcb76b8a9f6864525859b3ef"} Feb 23 14:50:39.304500 master-0 kubenswrapper[28758]: I0223 14:50:39.304372 28758 generic.go:334] "Generic (PLEG): container finished" podID="894fbd22-c889-426b-954b-04a9a0e4d905" containerID="9fac8bae38a13ee223ad77eb26713fc97e7e4c43fcb76b8a9f6864525859b3ef" exitCode=0 Feb 23 14:50:39.307773 master-0 kubenswrapper[28758]: I0223 14:50:39.307729 28758 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 14:50:39.315696 master-0 kubenswrapper[28758]: I0223 14:50:39.315607 28758 generic.go:334] "Generic (PLEG): container finished" podID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" 
containerID="ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1" exitCode=0 Feb 23 14:50:39.316142 master-0 kubenswrapper[28758]: I0223 14:50:39.315759 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f9f6fbc7-da78-4768-94ca-ed89ac38eec6","Type":"ContainerDied","Data":"ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1"} Feb 23 14:50:39.319294 master-0 kubenswrapper[28758]: I0223 14:50:39.319216 28758 generic.go:334] "Generic (PLEG): container finished" podID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerID="c1d09aa7450de0a128c7415e92e3b14172427ec6e9a4711c385022c917cdd8f1" exitCode=0 Feb 23 14:50:39.319407 master-0 kubenswrapper[28758]: I0223 14:50:39.319331 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" event={"ID":"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce","Type":"ContainerDied","Data":"c1d09aa7450de0a128c7415e92e3b14172427ec6e9a4711c385022c917cdd8f1"} Feb 23 14:50:39.323805 master-0 kubenswrapper[28758]: I0223 14:50:39.323702 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2d990-scheduler-0" event={"ID":"b7266a40-897f-41e1-a8bc-0bd0c7c0f268","Type":"ContainerStarted","Data":"a1fc9e2cbeb3a47b30edb020c83447d9bdfb2affa904dc05bf475a51f98d5e49"} Feb 23 14:50:39.327885 master-0 kubenswrapper[28758]: I0223 14:50:39.327781 28758 generic.go:334] "Generic (PLEG): container finished" podID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerID="8df68d5ca8c4ef5606a78ffd1f8761494e1613cefdb191fc895a8b358a8b3d9c" exitCode=143 Feb 23 14:50:39.329037 master-0 kubenswrapper[28758]: I0223 14:50:39.328958 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdcbb4f68-p649t" event={"ID":"2de7b549-c3f5-4105-8d7b-b0de62f9784e","Type":"ContainerDied","Data":"8df68d5ca8c4ef5606a78ffd1f8761494e1613cefdb191fc895a8b358a8b3d9c"} Feb 23 14:50:40.342722 master-0 kubenswrapper[28758]: I0223 14:50:40.342659 
28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" event={"ID":"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce","Type":"ContainerStarted","Data":"c8c2dea1161a129d36bce1c9b0d068cc637e4dfe1f874f2c144717c424b1a8bc"} Feb 23 14:50:40.686821 master-0 kubenswrapper[28758]: I0223 14:50:40.686729 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:41.879347 master-0 kubenswrapper[28758]: I0223 14:50:41.879274 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:41.983847 master-0 kubenswrapper[28758]: I0223 14:50:41.982623 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2d990-scheduler-0" podStartSLOduration=6.982600651 podStartE2EDuration="6.982600651s" podCreationTimestamp="2026-02-23 14:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:41.980116845 +0000 UTC m=+974.106432797" watchObservedRunningTime="2026-02-23 14:50:41.982600651 +0000 UTC m=+974.108916583" Feb 23 14:50:42.094028 master-0 kubenswrapper[28758]: I0223 14:50:42.093943 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" podStartSLOduration=6.093920949 podStartE2EDuration="6.093920949s" podCreationTimestamp="2026-02-23 14:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:42.081101508 +0000 UTC m=+974.207417440" watchObservedRunningTime="2026-02-23 14:50:42.093920949 +0000 UTC m=+974.220236871" Feb 23 14:50:42.256263 master-0 kubenswrapper[28758]: I0223 14:50:42.256168 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-454z9"] Feb 23 14:50:42.258330 
master-0 kubenswrapper[28758]: I0223 14:50:42.258290 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.261350 master-0 kubenswrapper[28758]: I0223 14:50:42.261279 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 23 14:50:42.265282 master-0 kubenswrapper[28758]: I0223 14:50:42.265158 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 23 14:50:42.275568 master-0 kubenswrapper[28758]: I0223 14:50:42.275466 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-454z9"] Feb 23 14:50:42.371384 master-0 kubenswrapper[28758]: I0223 14:50:42.371288 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.371384 master-0 kubenswrapper[28758]: I0223 14:50:42.371343 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6rhr\" (UniqueName: \"kubernetes.io/projected/76227c43-1d77-4dd0-93fd-90a100ccb01e-kube-api-access-z6rhr\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.371705 master-0 kubenswrapper[28758]: I0223 14:50:42.371419 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-scripts\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " 
pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.371705 master-0 kubenswrapper[28758]: I0223 14:50:42.371437 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-config-data\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.376852 master-0 kubenswrapper[28758]: I0223 14:50:42.375547 28758 generic.go:334] "Generic (PLEG): container finished" podID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerID="4f1c16bcc5617661be7a37f2eafd96e60f43859b2c7e4c4a5ff2bb5a0ce24f36" exitCode=0 Feb 23 14:50:42.377063 master-0 kubenswrapper[28758]: I0223 14:50:42.376921 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdcbb4f68-p649t" event={"ID":"2de7b549-c3f5-4105-8d7b-b0de62f9784e","Type":"ContainerDied","Data":"4f1c16bcc5617661be7a37f2eafd96e60f43859b2c7e4c4a5ff2bb5a0ce24f36"} Feb 23 14:50:42.495074 master-0 kubenswrapper[28758]: I0223 14:50:42.475306 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.495074 master-0 kubenswrapper[28758]: I0223 14:50:42.475358 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6rhr\" (UniqueName: \"kubernetes.io/projected/76227c43-1d77-4dd0-93fd-90a100ccb01e-kube-api-access-z6rhr\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.495074 master-0 kubenswrapper[28758]: I0223 14:50:42.475425 
28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-scripts\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.495074 master-0 kubenswrapper[28758]: I0223 14:50:42.475445 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-config-data\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.495074 master-0 kubenswrapper[28758]: I0223 14:50:42.483384 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-config-data\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.495074 master-0 kubenswrapper[28758]: I0223 14:50:42.483906 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.499456 master-0 kubenswrapper[28758]: I0223 14:50:42.499402 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-scripts\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.502179 master-0 kubenswrapper[28758]: I0223 
14:50:42.502138 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6rhr\" (UniqueName: \"kubernetes.io/projected/76227c43-1d77-4dd0-93fd-90a100ccb01e-kube-api-access-z6rhr\") pod \"nova-cell0-conductor-db-sync-454z9\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.587384 master-0 kubenswrapper[28758]: I0223 14:50:42.587111 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:50:42.819503 master-0 kubenswrapper[28758]: I0223 14:50:42.819100 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:50:42.819742 master-0 kubenswrapper[28758]: I0223 14:50:42.819593 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-63e78-default-external-api-0" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-log" containerID="cri-o://cca996d036b0d938c26b7289bdc7d311eca327d7be81f6a3839239c667eb9ed6" gracePeriod=30 Feb 23 14:50:42.823506 master-0 kubenswrapper[28758]: I0223 14:50:42.820471 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-63e78-default-external-api-0" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-httpd" containerID="cri-o://0de1652540e3d677e7cc14a5d0ddea1d004b11fbddcf5d72dd3721158d988478" gracePeriod=30 Feb 23 14:50:43.017746 master-0 kubenswrapper[28758]: I0223 14:50:43.017681 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:50:43.148046 master-0 kubenswrapper[28758]: I0223 14:50:43.147747 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2b6d\" (UniqueName: \"kubernetes.io/projected/2de7b549-c3f5-4105-8d7b-b0de62f9784e-kube-api-access-r2b6d\") pod \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " Feb 23 14:50:43.148046 master-0 kubenswrapper[28758]: I0223 14:50:43.147826 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-public-tls-certs\") pod \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " Feb 23 14:50:43.148046 master-0 kubenswrapper[28758]: I0223 14:50:43.147957 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-config-data\") pod \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " Feb 23 14:50:43.148046 master-0 kubenswrapper[28758]: I0223 14:50:43.148001 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-internal-tls-certs\") pod \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " Feb 23 14:50:43.148046 master-0 kubenswrapper[28758]: I0223 14:50:43.148067 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-scripts\") pod \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " Feb 23 14:50:43.148642 master-0 kubenswrapper[28758]: I0223 14:50:43.148103 28758 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de7b549-c3f5-4105-8d7b-b0de62f9784e-logs\") pod \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " Feb 23 14:50:43.148642 master-0 kubenswrapper[28758]: I0223 14:50:43.148126 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-combined-ca-bundle\") pod \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\" (UID: \"2de7b549-c3f5-4105-8d7b-b0de62f9784e\") " Feb 23 14:50:43.156050 master-0 kubenswrapper[28758]: I0223 14:50:43.155723 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de7b549-c3f5-4105-8d7b-b0de62f9784e-kube-api-access-r2b6d" (OuterVolumeSpecName: "kube-api-access-r2b6d") pod "2de7b549-c3f5-4105-8d7b-b0de62f9784e" (UID: "2de7b549-c3f5-4105-8d7b-b0de62f9784e"). InnerVolumeSpecName "kube-api-access-r2b6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:43.167834 master-0 kubenswrapper[28758]: I0223 14:50:43.167091 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2de7b549-c3f5-4105-8d7b-b0de62f9784e-logs" (OuterVolumeSpecName: "logs") pod "2de7b549-c3f5-4105-8d7b-b0de62f9784e" (UID: "2de7b549-c3f5-4105-8d7b-b0de62f9784e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:43.173619 master-0 kubenswrapper[28758]: I0223 14:50:43.171317 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-scripts" (OuterVolumeSpecName: "scripts") pod "2de7b549-c3f5-4105-8d7b-b0de62f9784e" (UID: "2de7b549-c3f5-4105-8d7b-b0de62f9784e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:43.246921 master-0 kubenswrapper[28758]: I0223 14:50:43.246735 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2de7b549-c3f5-4105-8d7b-b0de62f9784e" (UID: "2de7b549-c3f5-4105-8d7b-b0de62f9784e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:43.257670 master-0 kubenswrapper[28758]: I0223 14:50:43.257602 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:43.257670 master-0 kubenswrapper[28758]: I0223 14:50:43.257664 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2de7b549-c3f5-4105-8d7b-b0de62f9784e-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:43.257670 master-0 kubenswrapper[28758]: I0223 14:50:43.257680 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:43.258009 master-0 kubenswrapper[28758]: I0223 14:50:43.257697 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2b6d\" (UniqueName: \"kubernetes.io/projected/2de7b549-c3f5-4105-8d7b-b0de62f9784e-kube-api-access-r2b6d\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:43.269569 master-0 kubenswrapper[28758]: I0223 14:50:43.269266 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-454z9"] Feb 23 14:50:43.361505 master-0 kubenswrapper[28758]: I0223 14:50:43.361422 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2de7b549-c3f5-4105-8d7b-b0de62f9784e" (UID: "2de7b549-c3f5-4105-8d7b-b0de62f9784e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:43.362122 master-0 kubenswrapper[28758]: I0223 14:50:43.362089 28758 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:43.362241 master-0 kubenswrapper[28758]: I0223 14:50:43.362208 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2de7b549-c3f5-4105-8d7b-b0de62f9784e" (UID: "2de7b549-c3f5-4105-8d7b-b0de62f9784e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:43.371104 master-0 kubenswrapper[28758]: I0223 14:50:43.370949 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-config-data" (OuterVolumeSpecName: "config-data") pod "2de7b549-c3f5-4105-8d7b-b0de62f9784e" (UID: "2de7b549-c3f5-4105-8d7b-b0de62f9784e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:43.395097 master-0 kubenswrapper[28758]: I0223 14:50:43.394914 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-454z9" event={"ID":"76227c43-1d77-4dd0-93fd-90a100ccb01e","Type":"ContainerStarted","Data":"9a0eda192dfdaac647bc2de178172f8bb0b69d6badce9c5b7c54562b7f5ab4d2"} Feb 23 14:50:43.400332 master-0 kubenswrapper[28758]: I0223 14:50:43.399693 28758 generic.go:334] "Generic (PLEG): container finished" podID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerID="cca996d036b0d938c26b7289bdc7d311eca327d7be81f6a3839239c667eb9ed6" exitCode=143 Feb 23 14:50:43.400332 master-0 kubenswrapper[28758]: I0223 14:50:43.399914 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"a09b843c-6e74-4532-8ce4-26147e97d8c5","Type":"ContainerDied","Data":"cca996d036b0d938c26b7289bdc7d311eca327d7be81f6a3839239c667eb9ed6"} Feb 23 14:50:43.405253 master-0 kubenswrapper[28758]: I0223 14:50:43.405117 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdcbb4f68-p649t" event={"ID":"2de7b549-c3f5-4105-8d7b-b0de62f9784e","Type":"ContainerDied","Data":"ccf69e2cba2f92aa359cf82dc448702d3e77ae74d7a89b757206a96b1a70f7e3"} Feb 23 14:50:43.405369 master-0 kubenswrapper[28758]: I0223 14:50:43.405280 28758 scope.go:117] "RemoveContainer" containerID="4f1c16bcc5617661be7a37f2eafd96e60f43859b2c7e4c4a5ff2bb5a0ce24f36" Feb 23 14:50:43.405632 master-0 kubenswrapper[28758]: I0223 14:50:43.405610 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6bdcbb4f68-p649t" Feb 23 14:50:43.448339 master-0 kubenswrapper[28758]: I0223 14:50:43.448272 28758 scope.go:117] "RemoveContainer" containerID="8df68d5ca8c4ef5606a78ffd1f8761494e1613cefdb191fc895a8b358a8b3d9c" Feb 23 14:50:43.474986 master-0 kubenswrapper[28758]: I0223 14:50:43.474730 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:43.474986 master-0 kubenswrapper[28758]: I0223 14:50:43.474788 28758 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2de7b549-c3f5-4105-8d7b-b0de62f9784e-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:43.485596 master-0 kubenswrapper[28758]: I0223 14:50:43.485521 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6bdcbb4f68-p649t"] Feb 23 14:50:43.533066 master-0 kubenswrapper[28758]: I0223 14:50:43.532993 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6bdcbb4f68-p649t"] Feb 23 14:50:44.115905 master-0 kubenswrapper[28758]: I0223 14:50:44.115820 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" path="/var/lib/kubelet/pods/2de7b549-c3f5-4105-8d7b-b0de62f9784e/volumes" Feb 23 14:50:44.116784 master-0 kubenswrapper[28758]: I0223 14:50:44.116755 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:44.386852 master-0 kubenswrapper[28758]: I0223 14:50:44.385149 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:50:44.388208 master-0 kubenswrapper[28758]: I0223 14:50:44.387854 28758 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-63e78-default-internal-api-0" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-log" containerID="cri-o://eabfb46572be76a89d392cc4f8273ef2d73faca6dfeac81f8567fbb5e1856924" gracePeriod=30 Feb 23 14:50:44.388208 master-0 kubenswrapper[28758]: I0223 14:50:44.388048 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-63e78-default-internal-api-0" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-httpd" containerID="cri-o://1b34d62d94b3cfc0f7ea720c20214c859b4f1d29c479e024171cc5193e14f80b" gracePeriod=30 Feb 23 14:50:45.467602 master-0 kubenswrapper[28758]: I0223 14:50:45.467518 28758 generic.go:334] "Generic (PLEG): container finished" podID="76a97aa3-19a1-44d3-9019-da7b27957297" containerID="eabfb46572be76a89d392cc4f8273ef2d73faca6dfeac81f8567fbb5e1856924" exitCode=143 Feb 23 14:50:45.467602 master-0 kubenswrapper[28758]: I0223 14:50:45.467589 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"76a97aa3-19a1-44d3-9019-da7b27957297","Type":"ContainerDied","Data":"eabfb46572be76a89d392cc4f8273ef2d73faca6dfeac81f8567fbb5e1856924"} Feb 23 14:50:45.969009 master-0 kubenswrapper[28758]: I0223 14:50:45.968886 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-2d990-scheduler-0" Feb 23 14:50:46.490748 master-0 kubenswrapper[28758]: I0223 14:50:46.489628 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"a09b843c-6e74-4532-8ce4-26147e97d8c5","Type":"ContainerDied","Data":"0de1652540e3d677e7cc14a5d0ddea1d004b11fbddcf5d72dd3721158d988478"} Feb 23 14:50:46.490748 master-0 kubenswrapper[28758]: I0223 14:50:46.489373 28758 generic.go:334] "Generic (PLEG): container finished" podID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerID="0de1652540e3d677e7cc14a5d0ddea1d004b11fbddcf5d72dd3721158d988478" 
exitCode=0 Feb 23 14:50:46.881589 master-0 kubenswrapper[28758]: I0223 14:50:46.879645 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:50:47.052870 master-0 kubenswrapper[28758]: I0223 14:50:47.052781 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-677b4847c-74h4p"] Feb 23 14:50:47.053274 master-0 kubenswrapper[28758]: I0223 14:50:47.053197 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-677b4847c-74h4p" podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerName="dnsmasq-dns" containerID="cri-o://fa3cfa1fada3c7da4b5e8ec9e6a80a74a5a1d0556a728604612ebc94d61bf72e" gracePeriod=10 Feb 23 14:50:47.511166 master-0 kubenswrapper[28758]: I0223 14:50:47.511104 28758 generic.go:334] "Generic (PLEG): container finished" podID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerID="fa3cfa1fada3c7da4b5e8ec9e6a80a74a5a1d0556a728604612ebc94d61bf72e" exitCode=0 Feb 23 14:50:47.511712 master-0 kubenswrapper[28758]: I0223 14:50:47.511173 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b4847c-74h4p" event={"ID":"07aaf639-2ccb-4ffb-be79-572b8990a03a","Type":"ContainerDied","Data":"fa3cfa1fada3c7da4b5e8ec9e6a80a74a5a1d0556a728604612ebc94d61bf72e"} Feb 23 14:50:48.525368 master-0 kubenswrapper[28758]: I0223 14:50:48.525222 28758 generic.go:334] "Generic (PLEG): container finished" podID="76a97aa3-19a1-44d3-9019-da7b27957297" containerID="1b34d62d94b3cfc0f7ea720c20214c859b4f1d29c479e024171cc5193e14f80b" exitCode=0 Feb 23 14:50:48.525368 master-0 kubenswrapper[28758]: I0223 14:50:48.525343 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"76a97aa3-19a1-44d3-9019-da7b27957297","Type":"ContainerDied","Data":"1b34d62d94b3cfc0f7ea720c20214c859b4f1d29c479e024171cc5193e14f80b"} Feb 23 14:50:51.513520 master-0 
kubenswrapper[28758]: I0223 14:50:51.504268 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-63e78-default-internal-api-0" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-log" probeResult="failure" output="Get \"https://10.128.0.214:9292/healthcheck\": dial tcp 10.128.0.214:9292: connect: connection refused" Feb 23 14:50:51.513520 master-0 kubenswrapper[28758]: I0223 14:50:51.504415 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-63e78-default-internal-api-0" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.128.0.214:9292/healthcheck\": dial tcp 10.128.0.214:9292: connect: connection refused" Feb 23 14:50:51.573512 master-0 kubenswrapper[28758]: I0223 14:50:51.562889 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"a09b843c-6e74-4532-8ce4-26147e97d8c5","Type":"ContainerDied","Data":"56f7b6afc53ef30e889efc2e9c46ee89e3a7677a006d35de42f8733392b7a62a"} Feb 23 14:50:51.573512 master-0 kubenswrapper[28758]: I0223 14:50:51.562950 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f7b6afc53ef30e889efc2e9c46ee89e3a7677a006d35de42f8733392b7a62a" Feb 23 14:50:51.624648 master-0 kubenswrapper[28758]: I0223 14:50:51.624593 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.671387 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-config-data\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.671517 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-httpd-run\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.671577 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-logs\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.671650 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-public-tls-certs\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.671812 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-combined-ca-bundle\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.671929 28758 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-scripts\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.671953 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jngbf\" (UniqueName: \"kubernetes.io/projected/a09b843c-6e74-4532-8ce4-26147e97d8c5-kube-api-access-jngbf\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.672044 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"a09b843c-6e74-4532-8ce4-26147e97d8c5\" (UID: \"a09b843c-6e74-4532-8ce4-26147e97d8c5\") " Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.672856 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:51.673914 master-0 kubenswrapper[28758]: I0223 14:50:51.673147 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-logs" (OuterVolumeSpecName: "logs") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:51.687722 master-0 kubenswrapper[28758]: I0223 14:50:51.687638 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-scripts" (OuterVolumeSpecName: "scripts") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:51.692213 master-0 kubenswrapper[28758]: I0223 14:50:51.688641 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09b843c-6e74-4532-8ce4-26147e97d8c5-kube-api-access-jngbf" (OuterVolumeSpecName: "kube-api-access-jngbf") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "kube-api-access-jngbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:51.697919 master-0 kubenswrapper[28758]: I0223 14:50:51.697863 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322" (OuterVolumeSpecName: "glance") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "pvc-fee995dc-6f05-4147-9442-57dcc3df496b". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 14:50:51.714520 master-0 kubenswrapper[28758]: I0223 14:50:51.714440 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:51.785992 master-0 kubenswrapper[28758]: I0223 14:50:51.785939 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:51.787535 master-0 kubenswrapper[28758]: I0223 14:50:51.787454 28758 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-httpd-run\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:51.787535 master-0 kubenswrapper[28758]: I0223 14:50:51.787523 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09b843c-6e74-4532-8ce4-26147e97d8c5-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:51.787535 master-0 kubenswrapper[28758]: I0223 14:50:51.787534 28758 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:51.787693 master-0 kubenswrapper[28758]: I0223 14:50:51.787547 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:51.787693 master-0 kubenswrapper[28758]: I0223 14:50:51.787557 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:51.787693 master-0 kubenswrapper[28758]: I0223 14:50:51.787568 28758 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jngbf\" (UniqueName: \"kubernetes.io/projected/a09b843c-6e74-4532-8ce4-26147e97d8c5-kube-api-access-jngbf\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:51.787693 master-0 kubenswrapper[28758]: I0223 14:50:51.787596 28758 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") on node \"master-0\" " Feb 23 14:50:51.818795 master-0 kubenswrapper[28758]: I0223 14:50:51.818715 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-config-data" (OuterVolumeSpecName: "config-data") pod "a09b843c-6e74-4532-8ce4-26147e97d8c5" (UID: "a09b843c-6e74-4532-8ce4-26147e97d8c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:51.821074 master-0 kubenswrapper[28758]: I0223 14:50:51.821003 28758 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 23 14:50:51.821226 master-0 kubenswrapper[28758]: I0223 14:50:51.821196 28758 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-fee995dc-6f05-4147-9442-57dcc3df496b" (UniqueName: "kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322") on node "master-0" Feb 23 14:50:51.890316 master-0 kubenswrapper[28758]: I0223 14:50:51.890099 28758 reconciler_common.go:293] "Volume detached for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:51.890316 master-0 kubenswrapper[28758]: I0223 14:50:51.890159 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09b843c-6e74-4532-8ce4-26147e97d8c5-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:52.202352 master-0 kubenswrapper[28758]: I0223 14:50:52.202005 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:52.300664 master-0 kubenswrapper[28758]: I0223 14:50:52.300586 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-nb\") pod \"07aaf639-2ccb-4ffb-be79-572b8990a03a\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " Feb 23 14:50:52.301098 master-0 kubenswrapper[28758]: I0223 14:50:52.301062 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-config\") pod \"07aaf639-2ccb-4ffb-be79-572b8990a03a\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " Feb 23 14:50:52.301839 master-0 kubenswrapper[28758]: I0223 14:50:52.301707 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7zw8\" (UniqueName: \"kubernetes.io/projected/07aaf639-2ccb-4ffb-be79-572b8990a03a-kube-api-access-t7zw8\") pod \"07aaf639-2ccb-4ffb-be79-572b8990a03a\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " Feb 23 14:50:52.303010 master-0 kubenswrapper[28758]: I0223 14:50:52.302810 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-sb\") pod \"07aaf639-2ccb-4ffb-be79-572b8990a03a\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " Feb 23 14:50:52.303010 master-0 kubenswrapper[28758]: I0223 14:50:52.302920 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-swift-storage-0\") pod \"07aaf639-2ccb-4ffb-be79-572b8990a03a\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " Feb 23 14:50:52.303010 master-0 kubenswrapper[28758]: I0223 14:50:52.302958 28758 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-svc\") pod \"07aaf639-2ccb-4ffb-be79-572b8990a03a\" (UID: \"07aaf639-2ccb-4ffb-be79-572b8990a03a\") " Feb 23 14:50:52.307983 master-0 kubenswrapper[28758]: I0223 14:50:52.307924 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07aaf639-2ccb-4ffb-be79-572b8990a03a-kube-api-access-t7zw8" (OuterVolumeSpecName: "kube-api-access-t7zw8") pod "07aaf639-2ccb-4ffb-be79-572b8990a03a" (UID: "07aaf639-2ccb-4ffb-be79-572b8990a03a"). InnerVolumeSpecName "kube-api-access-t7zw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:52.373966 master-0 kubenswrapper[28758]: I0223 14:50:52.373910 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07aaf639-2ccb-4ffb-be79-572b8990a03a" (UID: "07aaf639-2ccb-4ffb-be79-572b8990a03a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:52.381299 master-0 kubenswrapper[28758]: I0223 14:50:52.381229 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07aaf639-2ccb-4ffb-be79-572b8990a03a" (UID: "07aaf639-2ccb-4ffb-be79-572b8990a03a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:52.410017 master-0 kubenswrapper[28758]: I0223 14:50:52.407036 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:52.410017 master-0 kubenswrapper[28758]: I0223 14:50:52.407082 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7zw8\" (UniqueName: \"kubernetes.io/projected/07aaf639-2ccb-4ffb-be79-572b8990a03a-kube-api-access-t7zw8\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:52.410017 master-0 kubenswrapper[28758]: I0223 14:50:52.407092 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:52.414385 master-0 kubenswrapper[28758]: I0223 14:50:52.414331 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-config" (OuterVolumeSpecName: "config") pod "07aaf639-2ccb-4ffb-be79-572b8990a03a" (UID: "07aaf639-2ccb-4ffb-be79-572b8990a03a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:52.421931 master-0 kubenswrapper[28758]: I0223 14:50:52.421859 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07aaf639-2ccb-4ffb-be79-572b8990a03a" (UID: "07aaf639-2ccb-4ffb-be79-572b8990a03a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:52.470134 master-0 kubenswrapper[28758]: I0223 14:50:52.470039 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07aaf639-2ccb-4ffb-be79-572b8990a03a" (UID: "07aaf639-2ccb-4ffb-be79-572b8990a03a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:50:52.516461 master-0 kubenswrapper[28758]: I0223 14:50:52.516401 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:52.516461 master-0 kubenswrapper[28758]: I0223 14:50:52.516455 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:52.516461 master-0 kubenswrapper[28758]: I0223 14:50:52.516468 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07aaf639-2ccb-4ffb-be79-572b8990a03a-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:52.583366 master-0 kubenswrapper[28758]: I0223 14:50:52.583308 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.584675 master-0 kubenswrapper[28758]: I0223 14:50:52.584585 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-677b4847c-74h4p" Feb 23 14:50:52.586019 master-0 kubenswrapper[28758]: I0223 14:50:52.585971 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-677b4847c-74h4p" event={"ID":"07aaf639-2ccb-4ffb-be79-572b8990a03a","Type":"ContainerDied","Data":"49faa17d5b50e1775025769332eb5f1f97e9dc8fdab50376834e4752b3dcb907"} Feb 23 14:50:52.586102 master-0 kubenswrapper[28758]: I0223 14:50:52.586068 28758 scope.go:117] "RemoveContainer" containerID="fa3cfa1fada3c7da4b5e8ec9e6a80a74a5a1d0556a728604612ebc94d61bf72e" Feb 23 14:50:52.711742 master-0 kubenswrapper[28758]: I0223 14:50:52.711650 28758 scope.go:117] "RemoveContainer" containerID="daf88e757071588e1c1e500c56b0244ca9ad17ba37196c8aeb3f21f3fb19885b" Feb 23 14:50:52.714904 master-0 kubenswrapper[28758]: I0223 14:50:52.714776 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:50:52.741455 master-0 kubenswrapper[28758]: I0223 14:50:52.741389 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:50:52.806777 master-0 kubenswrapper[28758]: I0223 14:50:52.806715 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-677b4847c-74h4p"] Feb 23 14:50:52.837847 master-0 kubenswrapper[28758]: I0223 14:50:52.837670 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-677b4847c-74h4p"] Feb 23 14:50:52.850244 master-0 kubenswrapper[28758]: I0223 14:50:52.850183 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:50:52.850889 master-0 kubenswrapper[28758]: E0223 14:50:52.850848 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-log" Feb 23 14:50:52.850889 master-0 kubenswrapper[28758]: I0223 14:50:52.850875 28758 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-log" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: E0223 14:50:52.850918 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerName="dnsmasq-dns" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: I0223 14:50:52.850925 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerName="dnsmasq-dns" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: E0223 14:50:52.850949 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerName="init" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: I0223 14:50:52.850955 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerName="init" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: E0223 14:50:52.850970 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-httpd" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: I0223 14:50:52.850977 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-httpd" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: E0223 14:50:52.850992 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-log" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: I0223 14:50:52.851000 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-log" Feb 23 14:50:52.851012 master-0 kubenswrapper[28758]: E0223 14:50:52.851013 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-api" Feb 23 
14:50:52.851012 master-0 kubenswrapper[28758]: I0223 14:50:52.851021 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-api" Feb 23 14:50:52.851391 master-0 kubenswrapper[28758]: I0223 14:50:52.851257 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-api" Feb 23 14:50:52.851391 master-0 kubenswrapper[28758]: I0223 14:50:52.851306 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-log" Feb 23 14:50:52.851391 master-0 kubenswrapper[28758]: I0223 14:50:52.851315 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de7b549-c3f5-4105-8d7b-b0de62f9784e" containerName="placement-log" Feb 23 14:50:52.851391 master-0 kubenswrapper[28758]: I0223 14:50:52.851327 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" containerName="glance-httpd" Feb 23 14:50:52.851391 master-0 kubenswrapper[28758]: I0223 14:50:52.851342 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerName="dnsmasq-dns" Feb 23 14:50:52.853020 master-0 kubenswrapper[28758]: I0223 14:50:52.852993 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.855442 master-0 kubenswrapper[28758]: I0223 14:50:52.855123 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 23 14:50:52.855442 master-0 kubenswrapper[28758]: I0223 14:50:52.855368 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-63e78-default-external-config-data" Feb 23 14:50:52.864546 master-0 kubenswrapper[28758]: I0223 14:50:52.863672 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:50:52.935902 master-0 kubenswrapper[28758]: I0223 14:50:52.935834 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:52.940636 master-0 kubenswrapper[28758]: I0223 14:50:52.940567 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/509b3d39-1b4f-440e-831b-997244388255-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.940921 master-0 kubenswrapper[28758]: I0223 14:50:52.940662 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.940921 master-0 kubenswrapper[28758]: I0223 14:50:52.940706 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-public-tls-certs\") 
pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.940921 master-0 kubenswrapper[28758]: I0223 14:50:52.940733 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.940921 master-0 kubenswrapper[28758]: I0223 14:50:52.940783 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.940921 master-0 kubenswrapper[28758]: I0223 14:50:52.940893 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/509b3d39-1b4f-440e-831b-997244388255-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.941104 master-0 kubenswrapper[28758]: I0223 14:50:52.940955 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:52.941192 master-0 kubenswrapper[28758]: I0223 14:50:52.941165 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qfr\" (UniqueName: \"kubernetes.io/projected/509b3d39-1b4f-440e-831b-997244388255-kube-api-access-29qfr\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.043972 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-httpd-run\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.044146 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-internal-tls-certs\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.044288 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-config-data\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.044378 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-logs\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.044451 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.044663 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045040 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-combined-ca-bundle\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045071 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-scripts\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045116 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdx92\" (UniqueName: \"kubernetes.io/projected/76a97aa3-19a1-44d3-9019-da7b27957297-kube-api-access-vdx92\") pod \"76a97aa3-19a1-44d3-9019-da7b27957297\" (UID: \"76a97aa3-19a1-44d3-9019-da7b27957297\") " Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045695 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/509b3d39-1b4f-440e-831b-997244388255-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045734 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-logs" (OuterVolumeSpecName: "logs") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045782 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045860 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-public-tls-certs\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045888 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.045996 28758 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.046048 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/509b3d39-1b4f-440e-831b-997244388255-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.046084 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.046247 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29qfr\" (UniqueName: \"kubernetes.io/projected/509b3d39-1b4f-440e-831b-997244388255-kube-api-access-29qfr\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.046302 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/509b3d39-1b4f-440e-831b-997244388255-httpd-run\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.049288 master-0 
kubenswrapper[28758]: I0223 14:50:53.046616 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.046643 28758 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/76a97aa3-19a1-44d3-9019-da7b27957297-httpd-run\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.049288 master-0 kubenswrapper[28758]: I0223 14:50:53.048044 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/509b3d39-1b4f-440e-831b-997244388255-logs\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.051638 master-0 kubenswrapper[28758]: I0223 14:50:53.051597 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 14:50:53.051720 master-0 kubenswrapper[28758]: I0223 14:50:53.051662 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/e3fa539436bb43727cdf11c2c03cf22a4b969059756e5f07ca659a4a6862fdb6/globalmount\"" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.053647 master-0 kubenswrapper[28758]: I0223 14:50:53.053598 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-scripts\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.066189 master-0 kubenswrapper[28758]: I0223 14:50:53.066130 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-combined-ca-bundle\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.067628 master-0 kubenswrapper[28758]: I0223 14:50:53.067591 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-config-data\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.077703 master-0 kubenswrapper[28758]: I0223 14:50:53.077527 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-scripts" (OuterVolumeSpecName: "scripts") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:53.078025 master-0 kubenswrapper[28758]: I0223 14:50:53.077987 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/509b3d39-1b4f-440e-831b-997244388255-public-tls-certs\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.080263 master-0 kubenswrapper[28758]: I0223 14:50:53.080216 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qfr\" (UniqueName: \"kubernetes.io/projected/509b3d39-1b4f-440e-831b-997244388255-kube-api-access-29qfr\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:53.080461 master-0 kubenswrapper[28758]: I0223 14:50:53.080248 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a97aa3-19a1-44d3-9019-da7b27957297-kube-api-access-vdx92" (OuterVolumeSpecName: "kube-api-access-vdx92") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "kube-api-access-vdx92". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:53.096921 master-0 kubenswrapper[28758]: I0223 14:50:53.096849 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28" (OuterVolumeSpecName: "glance") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 14:50:53.151222 master-0 kubenswrapper[28758]: I0223 14:50:53.151081 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.151222 master-0 kubenswrapper[28758]: I0223 14:50:53.151127 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdx92\" (UniqueName: \"kubernetes.io/projected/76a97aa3-19a1-44d3-9019-da7b27957297-kube-api-access-vdx92\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.151222 master-0 kubenswrapper[28758]: I0223 14:50:53.151150 28758 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") on node \"master-0\" " Feb 23 14:50:53.205167 master-0 kubenswrapper[28758]: I0223 14:50:53.205118 28758 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 23 14:50:53.205384 master-0 kubenswrapper[28758]: I0223 14:50:53.205292 28758 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c" (UniqueName: "kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28") on node "master-0" Feb 23 14:50:53.237319 master-0 kubenswrapper[28758]: I0223 14:50:53.236643 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:53.255354 master-0 kubenswrapper[28758]: I0223 14:50:53.255168 28758 reconciler_common.go:293] "Volume detached for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.255354 master-0 kubenswrapper[28758]: I0223 14:50:53.255351 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.281451 master-0 kubenswrapper[28758]: I0223 14:50:53.281355 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-config-data" (OuterVolumeSpecName: "config-data") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:53.290100 master-0 kubenswrapper[28758]: I0223 14:50:53.290035 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "76a97aa3-19a1-44d3-9019-da7b27957297" (UID: "76a97aa3-19a1-44d3-9019-da7b27957297"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:53.361405 master-0 kubenswrapper[28758]: I0223 14:50:53.361181 28758 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.361405 master-0 kubenswrapper[28758]: I0223 14:50:53.361245 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a97aa3-19a1-44d3-9019-da7b27957297-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:53.599049 master-0 kubenswrapper[28758]: I0223 14:50:53.598976 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-454z9" event={"ID":"76227c43-1d77-4dd0-93fd-90a100ccb01e","Type":"ContainerStarted","Data":"3656f1f4e8318bcc4bcd30498a1cd13ed4f65f408c0527816b585587aefa1cae"} Feb 23 14:50:53.601930 master-0 kubenswrapper[28758]: I0223 14:50:53.601880 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerStarted","Data":"c29f86f08145946322f1837c39c2016a439af71f93f855c7d6aa6138194a1930"} Feb 23 14:50:53.604158 master-0 kubenswrapper[28758]: I0223 14:50:53.604118 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f9f6fbc7-da78-4768-94ca-ed89ac38eec6","Type":"ContainerStarted","Data":"03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7"} Feb 23 14:50:53.604719 master-0 kubenswrapper[28758]: I0223 14:50:53.604316 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-inspector-0" podUID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" containerName="inspector-pxe-init" containerID="cri-o://03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7" gracePeriod=60 Feb 23 14:50:53.610949 master-0 
kubenswrapper[28758]: I0223 14:50:53.610884 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"76a97aa3-19a1-44d3-9019-da7b27957297","Type":"ContainerDied","Data":"8fc8c6cfd6b7d27127599caf0fa236d17dda5219c208ccbc690fc2e0a894b165"} Feb 23 14:50:53.611069 master-0 kubenswrapper[28758]: I0223 14:50:53.610957 28758 scope.go:117] "RemoveContainer" containerID="1b34d62d94b3cfc0f7ea720c20214c859b4f1d29c479e024171cc5193e14f80b" Feb 23 14:50:53.611119 master-0 kubenswrapper[28758]: I0223 14:50:53.611085 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.628541 master-0 kubenswrapper[28758]: I0223 14:50:53.625942 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-454z9" podStartSLOduration=2.727572314 podStartE2EDuration="11.625919665s" podCreationTimestamp="2026-02-23 14:50:42 +0000 UTC" firstStartedPulling="2026-02-23 14:50:43.322644615 +0000 UTC m=+975.448960547" lastFinishedPulling="2026-02-23 14:50:52.220991966 +0000 UTC m=+984.347307898" observedRunningTime="2026-02-23 14:50:53.618020075 +0000 UTC m=+985.744336017" watchObservedRunningTime="2026-02-23 14:50:53.625919665 +0000 UTC m=+985.752235597" Feb 23 14:50:53.665814 master-0 kubenswrapper[28758]: I0223 14:50:53.661628 28758 scope.go:117] "RemoveContainer" containerID="eabfb46572be76a89d392cc4f8273ef2d73faca6dfeac81f8567fbb5e1856924" Feb 23 14:50:53.760195 master-0 kubenswrapper[28758]: I0223 14:50:53.760117 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:50:53.808620 master-0 kubenswrapper[28758]: I0223 14:50:53.805271 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:50:53.820258 master-0 kubenswrapper[28758]: I0223 14:50:53.818866 28758 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:50:53.820258 master-0 kubenswrapper[28758]: E0223 14:50:53.819831 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-log" Feb 23 14:50:53.820258 master-0 kubenswrapper[28758]: I0223 14:50:53.819857 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-log" Feb 23 14:50:53.820258 master-0 kubenswrapper[28758]: E0223 14:50:53.819902 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-httpd" Feb 23 14:50:53.820258 master-0 kubenswrapper[28758]: I0223 14:50:53.819909 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-httpd" Feb 23 14:50:53.820258 master-0 kubenswrapper[28758]: I0223 14:50:53.820208 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-httpd" Feb 23 14:50:53.820258 master-0 kubenswrapper[28758]: I0223 14:50:53.820233 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" containerName="glance-log" Feb 23 14:50:53.821844 master-0 kubenswrapper[28758]: I0223 14:50:53.821807 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.828186 master-0 kubenswrapper[28758]: I0223 14:50:53.824869 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 23 14:50:53.828186 master-0 kubenswrapper[28758]: I0223 14:50:53.825259 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-63e78-default-internal-config-data" Feb 23 14:50:53.831390 master-0 kubenswrapper[28758]: I0223 14:50:53.831324 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:50:53.977373 master-0 kubenswrapper[28758]: I0223 14:50:53.977298 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.977373 master-0 kubenswrapper[28758]: I0223 14:50:53.977367 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.977725 master-0 kubenswrapper[28758]: I0223 14:50:53.977426 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.977725 master-0 
kubenswrapper[28758]: I0223 14:50:53.977631 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548tk\" (UniqueName: \"kubernetes.io/projected/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-kube-api-access-548tk\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.977725 master-0 kubenswrapper[28758]: I0223 14:50:53.977696 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.978026 master-0 kubenswrapper[28758]: I0223 14:50:53.977957 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.978026 master-0 kubenswrapper[28758]: I0223 14:50:53.977996 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.979582 master-0 kubenswrapper[28758]: I0223 14:50:53.978630 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-config-data\") pod 
\"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:53.983358 master-0 kubenswrapper[28758]: I0223 14:50:53.983310 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fee995dc-6f05-4147-9442-57dcc3df496b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4676deba-719d-4f1a-a679-315252c62322\") pod \"glance-63e78-default-external-api-0\" (UID: \"509b3d39-1b4f-440e-831b-997244388255\") " pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:54.085745 master-0 kubenswrapper[28758]: I0223 14:50:54.083628 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.085745 master-0 kubenswrapper[28758]: I0223 14:50:54.085692 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.085745 master-0 kubenswrapper[28758]: I0223 14:50:54.085727 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.086492 master-0 kubenswrapper[28758]: I0223 14:50:54.085775 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" 
(UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.086603 master-0 kubenswrapper[28758]: I0223 14:50:54.086575 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-548tk\" (UniqueName: \"kubernetes.io/projected/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-kube-api-access-548tk\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.086657 master-0 kubenswrapper[28758]: I0223 14:50:54.086609 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.086657 master-0 kubenswrapper[28758]: I0223 14:50:54.086651 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.086746 master-0 kubenswrapper[28758]: I0223 14:50:54.086679 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.087270 master-0 kubenswrapper[28758]: I0223 14:50:54.087240 28758 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-logs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.087621 master-0 kubenswrapper[28758]: I0223 14:50:54.087594 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-httpd-run\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.089314 master-0 kubenswrapper[28758]: I0223 14:50:54.089254 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-scripts\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.090883 master-0 kubenswrapper[28758]: I0223 14:50:54.090843 28758 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 14:50:54.090957 master-0 kubenswrapper[28758]: I0223 14:50:54.090897 28758 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/152be6b8b3317aae1186e616b7fc755c4c4a3c8c7584c6b0c3c221e4abb1d1f4/globalmount\"" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.092558 master-0 kubenswrapper[28758]: I0223 14:50:54.092511 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-config-data\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.093913 master-0 kubenswrapper[28758]: I0223 14:50:54.093876 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-combined-ca-bundle\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.102153 master-0 kubenswrapper[28758]: I0223 14:50:54.102070 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-internal-tls-certs\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.111522 master-0 kubenswrapper[28758]: I0223 14:50:54.106056 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" path="/var/lib/kubelet/pods/07aaf639-2ccb-4ffb-be79-572b8990a03a/volumes" Feb 23 14:50:54.111522 master-0 kubenswrapper[28758]: I0223 14:50:54.106893 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a97aa3-19a1-44d3-9019-da7b27957297" path="/var/lib/kubelet/pods/76a97aa3-19a1-44d3-9019-da7b27957297/volumes" Feb 23 14:50:54.111522 master-0 kubenswrapper[28758]: I0223 14:50:54.107631 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09b843c-6e74-4532-8ce4-26147e97d8c5" path="/var/lib/kubelet/pods/a09b843c-6e74-4532-8ce4-26147e97d8c5/volumes" Feb 23 14:50:54.111522 master-0 kubenswrapper[28758]: I0223 14:50:54.108971 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-548tk\" (UniqueName: \"kubernetes.io/projected/3be6a9b5-62a2-49f0-8871-aed1d7a7f588-kube-api-access-548tk\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:54.191779 master-0 kubenswrapper[28758]: I0223 14:50:54.191708 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:50:54.349241 master-0 kubenswrapper[28758]: I0223 14:50:54.349192 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Feb 23 14:50:54.501693 master-0 kubenswrapper[28758]: I0223 14:50:54.501635 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk9nx\" (UniqueName: \"kubernetes.io/projected/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-kube-api-access-zk9nx\") pod \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " Feb 23 14:50:54.502144 master-0 kubenswrapper[28758]: I0223 14:50:54.502125 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-scripts\") pod \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " Feb 23 14:50:54.502412 master-0 kubenswrapper[28758]: I0223 14:50:54.502397 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-etc-podinfo\") pod \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " Feb 23 14:50:54.502541 master-0 kubenswrapper[28758]: I0223 14:50:54.502526 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-combined-ca-bundle\") pod \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " Feb 23 14:50:54.502630 master-0 kubenswrapper[28758]: I0223 14:50:54.502616 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic\") pod \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " Feb 23 14:50:54.502750 master-0 kubenswrapper[28758]: I0223 14:50:54.502735 28758 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " Feb 23 14:50:54.502877 master-0 kubenswrapper[28758]: I0223 14:50:54.502855 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-config\") pod \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\" (UID: \"f9f6fbc7-da78-4768-94ca-ed89ac38eec6\") " Feb 23 14:50:54.506441 master-0 kubenswrapper[28758]: I0223 14:50:54.506310 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-kube-api-access-zk9nx" (OuterVolumeSpecName: "kube-api-access-zk9nx") pod "f9f6fbc7-da78-4768-94ca-ed89ac38eec6" (UID: "f9f6fbc7-da78-4768-94ca-ed89ac38eec6"). InnerVolumeSpecName "kube-api-access-zk9nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:50:54.506688 master-0 kubenswrapper[28758]: I0223 14:50:54.506627 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-scripts" (OuterVolumeSpecName: "scripts") pod "f9f6fbc7-da78-4768-94ca-ed89ac38eec6" (UID: "f9f6fbc7-da78-4768-94ca-ed89ac38eec6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:54.506804 master-0 kubenswrapper[28758]: I0223 14:50:54.506738 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "f9f6fbc7-da78-4768-94ca-ed89ac38eec6" (UID: "f9f6fbc7-da78-4768-94ca-ed89ac38eec6"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 23 14:50:54.508065 master-0 kubenswrapper[28758]: I0223 14:50:54.508001 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "f9f6fbc7-da78-4768-94ca-ed89ac38eec6" (UID: "f9f6fbc7-da78-4768-94ca-ed89ac38eec6"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:54.508150 master-0 kubenswrapper[28758]: I0223 14:50:54.508060 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-config" (OuterVolumeSpecName: "config") pod "f9f6fbc7-da78-4768-94ca-ed89ac38eec6" (UID: "f9f6fbc7-da78-4768-94ca-ed89ac38eec6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:54.520706 master-0 kubenswrapper[28758]: I0223 14:50:54.519920 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "f9f6fbc7-da78-4768-94ca-ed89ac38eec6" (UID: "f9f6fbc7-da78-4768-94ca-ed89ac38eec6"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:50:54.564215 master-0 kubenswrapper[28758]: I0223 14:50:54.564150 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9f6fbc7-da78-4768-94ca-ed89ac38eec6" (UID: "f9f6fbc7-da78-4768-94ca-ed89ac38eec6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:50:54.606171 master-0 kubenswrapper[28758]: I0223 14:50:54.606048 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk9nx\" (UniqueName: \"kubernetes.io/projected/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-kube-api-access-zk9nx\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:54.606171 master-0 kubenswrapper[28758]: I0223 14:50:54.606095 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:54.606171 master-0 kubenswrapper[28758]: I0223 14:50:54.606107 28758 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:54.606171 master-0 kubenswrapper[28758]: I0223 14:50:54.606115 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:54.606171 master-0 kubenswrapper[28758]: I0223 14:50:54.606124 28758 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:54.606171 master-0 kubenswrapper[28758]: I0223 14:50:54.606136 28758 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:54.606171 master-0 kubenswrapper[28758]: I0223 14:50:54.606153 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f9f6fbc7-da78-4768-94ca-ed89ac38eec6-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:50:54.625352 master-0 kubenswrapper[28758]: I0223 14:50:54.625285 28758 generic.go:334] "Generic (PLEG): container finished" podID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" containerID="03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7" exitCode=0 Feb 23 14:50:54.625689 master-0 kubenswrapper[28758]: I0223 14:50:54.625373 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f9f6fbc7-da78-4768-94ca-ed89ac38eec6","Type":"ContainerDied","Data":"03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7"} Feb 23 14:50:54.625689 master-0 kubenswrapper[28758]: I0223 14:50:54.625408 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"f9f6fbc7-da78-4768-94ca-ed89ac38eec6","Type":"ContainerDied","Data":"5d1a2c37ac72b57d6a4fb42ada2153306ad3d42652a5eccd5abb8950ec62c175"} Feb 23 14:50:54.625689 master-0 kubenswrapper[28758]: I0223 14:50:54.625425 28758 scope.go:117] "RemoveContainer" containerID="03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7" Feb 23 14:50:54.625689 master-0 kubenswrapper[28758]: I0223 14:50:54.625571 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Feb 23 14:50:54.682097 master-0 kubenswrapper[28758]: I0223 14:50:54.682027 28758 scope.go:117] "RemoveContainer" containerID="ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1" Feb 23 14:50:54.714413 master-0 kubenswrapper[28758]: I0223 14:50:54.714367 28758 scope.go:117] "RemoveContainer" containerID="03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7" Feb 23 14:50:54.722965 master-0 kubenswrapper[28758]: E0223 14:50:54.720260 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7\": container with ID starting with 03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7 not found: ID does not exist" containerID="03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7" Feb 23 14:50:54.722965 master-0 kubenswrapper[28758]: I0223 14:50:54.720325 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7"} err="failed to get container status \"03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7\": rpc error: code = NotFound desc = could not find container \"03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7\": container with ID starting with 03809147027d4b42773031786f8d3d9643972c3b76388f87613a5ac4c09475a7 not found: ID does not exist" Feb 23 14:50:54.722965 master-0 kubenswrapper[28758]: I0223 14:50:54.720362 28758 scope.go:117] "RemoveContainer" containerID="ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1" Feb 23 14:50:54.722965 master-0 kubenswrapper[28758]: E0223 14:50:54.722280 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1\": 
container with ID starting with ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1 not found: ID does not exist" containerID="ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1" Feb 23 14:50:54.722965 master-0 kubenswrapper[28758]: I0223 14:50:54.722356 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1"} err="failed to get container status \"ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1\": rpc error: code = NotFound desc = could not find container \"ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1\": container with ID starting with ba6c66b9e82399edb9054fe6f305a0a99b532d92daa4459913775a35869089a1 not found: ID does not exist" Feb 23 14:50:54.752832 master-0 kubenswrapper[28758]: I0223 14:50:54.752754 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:54.764385 master-0 kubenswrapper[28758]: I0223 14:50:54.764303 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:54.823503 master-0 kubenswrapper[28758]: I0223 14:50:54.816023 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:54.823503 master-0 kubenswrapper[28758]: E0223 14:50:54.816792 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" containerName="inspector-pxe-init" Feb 23 14:50:54.823503 master-0 kubenswrapper[28758]: I0223 14:50:54.816821 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" containerName="inspector-pxe-init" Feb 23 14:50:54.823503 master-0 kubenswrapper[28758]: E0223 14:50:54.816864 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" containerName="ironic-python-agent-init" Feb 23 14:50:54.823503 
master-0 kubenswrapper[28758]: I0223 14:50:54.816872 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" containerName="ironic-python-agent-init" Feb 23 14:50:54.823503 master-0 kubenswrapper[28758]: I0223 14:50:54.817143 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" containerName="inspector-pxe-init" Feb 23 14:50:54.833647 master-0 kubenswrapper[28758]: I0223 14:50:54.825698 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Feb 23 14:50:54.835143 master-0 kubenswrapper[28758]: I0223 14:50:54.835090 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Feb 23 14:50:54.835143 master-0 kubenswrapper[28758]: I0223 14:50:54.835090 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Feb 23 14:50:54.835272 master-0 kubenswrapper[28758]: I0223 14:50:54.835226 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Feb 23 14:50:54.835412 master-0 kubenswrapper[28758]: I0223 14:50:54.835382 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Feb 23 14:50:54.837177 master-0 kubenswrapper[28758]: I0223 14:50:54.835898 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-external-api-0"] Feb 23 14:50:54.837177 master-0 kubenswrapper[28758]: I0223 14:50:54.836281 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Feb 23 14:50:54.855498 master-0 kubenswrapper[28758]: I0223 14:50:54.852427 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:54.862220 master-0 kubenswrapper[28758]: W0223 14:50:54.862080 28758 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod509b3d39_1b4f_440e_831b_997244388255.slice/crio-cf76c84fb68be3d68620f47ccb08fb7367afa631f6fe62ec8f9418dc694303df WatchSource:0}: Error finding container cf76c84fb68be3d68620f47ccb08fb7367afa631f6fe62ec8f9418dc694303df: Status 404 returned error can't find the container with id cf76c84fb68be3d68620f47ccb08fb7367afa631f6fe62ec8f9418dc694303df Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921196 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921338 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5819b1bd-dd89-45b4-84ed-83a1355314de-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921462 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5819b1bd-dd89-45b4-84ed-83a1355314de-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921547 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5l9x\" (UniqueName: \"kubernetes.io/projected/5819b1bd-dd89-45b4-84ed-83a1355314de-kube-api-access-q5l9x\") pod 
\"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921651 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-config\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921683 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/5819b1bd-dd89-45b4-84ed-83a1355314de-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921827 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-scripts\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921873 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:54.922642 master-0 kubenswrapper[28758]: I0223 14:50:54.921896 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.021732 master-0 kubenswrapper[28758]: I0223 14:50:55.021669 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-94be7a4f-8e83-4b52-8f7d-b530d749c57c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^1a730645-16f8-4853-9800-95bc968aad28\") pod \"glance-63e78-default-internal-api-0\" (UID: \"3be6a9b5-62a2-49f0-8871-aed1d7a7f588\") " pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:55.024942 master-0 kubenswrapper[28758]: I0223 14:50:55.024885 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-config\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.024942 master-0 kubenswrapper[28758]: I0223 14:50:55.024931 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/5819b1bd-dd89-45b4-84ed-83a1355314de-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.025197 master-0 kubenswrapper[28758]: I0223 14:50:55.025030 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-scripts\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.025197 master-0 kubenswrapper[28758]: I0223 14:50:55.025064 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.025197 master-0 kubenswrapper[28758]: I0223 14:50:55.025081 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.025197 master-0 kubenswrapper[28758]: I0223 14:50:55.025162 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.025343 master-0 kubenswrapper[28758]: I0223 14:50:55.025206 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5819b1bd-dd89-45b4-84ed-83a1355314de-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.025343 master-0 kubenswrapper[28758]: I0223 14:50:55.025235 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5819b1bd-dd89-45b4-84ed-83a1355314de-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.025343 master-0 kubenswrapper[28758]: I0223 14:50:55.025281 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5l9x\" (UniqueName: 
\"kubernetes.io/projected/5819b1bd-dd89-45b4-84ed-83a1355314de-kube-api-access-q5l9x\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.026763 master-0 kubenswrapper[28758]: I0223 14:50:55.026714 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5819b1bd-dd89-45b4-84ed-83a1355314de-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.027081 master-0 kubenswrapper[28758]: I0223 14:50:55.027035 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/5819b1bd-dd89-45b4-84ed-83a1355314de-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.029352 master-0 kubenswrapper[28758]: I0223 14:50:55.029325 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.029885 master-0 kubenswrapper[28758]: I0223 14:50:55.029848 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-scripts\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.030181 master-0 kubenswrapper[28758]: I0223 14:50:55.030150 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5819b1bd-dd89-45b4-84ed-83a1355314de-etc-podinfo\") pod 
\"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.030626 master-0 kubenswrapper[28758]: I0223 14:50:55.030591 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.031315 master-0 kubenswrapper[28758]: I0223 14:50:55.031279 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-config\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.032880 master-0 kubenswrapper[28758]: I0223 14:50:55.032803 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5819b1bd-dd89-45b4-84ed-83a1355314de-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.051364 master-0 kubenswrapper[28758]: I0223 14:50:55.051266 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5l9x\" (UniqueName: \"kubernetes.io/projected/5819b1bd-dd89-45b4-84ed-83a1355314de-kube-api-access-q5l9x\") pod \"ironic-inspector-0\" (UID: \"5819b1bd-dd89-45b4-84ed-83a1355314de\") " pod="openstack/ironic-inspector-0" Feb 23 14:50:55.071074 master-0 kubenswrapper[28758]: I0223 14:50:55.070974 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:50:55.187700 master-0 kubenswrapper[28758]: I0223 14:50:55.187624 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Feb 23 14:50:55.654423 master-0 kubenswrapper[28758]: I0223 14:50:55.650450 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"509b3d39-1b4f-440e-831b-997244388255","Type":"ContainerStarted","Data":"357bd30c641eb721965ffd9ea08f6a71df55a32ae8a6c882432344dc54be4511"} Feb 23 14:50:55.654423 master-0 kubenswrapper[28758]: I0223 14:50:55.652817 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"509b3d39-1b4f-440e-831b-997244388255","Type":"ContainerStarted","Data":"cf76c84fb68be3d68620f47ccb08fb7367afa631f6fe62ec8f9418dc694303df"} Feb 23 14:50:55.693102 master-0 kubenswrapper[28758]: I0223 14:50:55.692940 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-63e78-default-internal-api-0"] Feb 23 14:50:55.824575 master-0 kubenswrapper[28758]: W0223 14:50:55.824505 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5819b1bd_dd89_45b4_84ed_83a1355314de.slice/crio-8460f6779a35448eb9c96b1cf342eaf9e2fc9268b63dcabe806b79c8a1025f12 WatchSource:0}: Error finding container 8460f6779a35448eb9c96b1cf342eaf9e2fc9268b63dcabe806b79c8a1025f12: Status 404 returned error can't find the container with id 8460f6779a35448eb9c96b1cf342eaf9e2fc9268b63dcabe806b79c8a1025f12 Feb 23 14:50:55.834698 master-0 kubenswrapper[28758]: I0223 14:50:55.834638 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Feb 23 14:50:56.109074 master-0 kubenswrapper[28758]: I0223 14:50:56.109014 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f6fbc7-da78-4768-94ca-ed89ac38eec6" path="/var/lib/kubelet/pods/f9f6fbc7-da78-4768-94ca-ed89ac38eec6/volumes" Feb 23 14:50:56.709090 master-0 kubenswrapper[28758]: I0223 14:50:56.709004 28758 
generic.go:334] "Generic (PLEG): container finished" podID="5819b1bd-dd89-45b4-84ed-83a1355314de" containerID="71734c3ee3179eff0edfc70fe4a55df97e581ebfde56032527a6e77cbaee1e44" exitCode=0 Feb 23 14:50:56.709090 master-0 kubenswrapper[28758]: I0223 14:50:56.709069 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerDied","Data":"71734c3ee3179eff0edfc70fe4a55df97e581ebfde56032527a6e77cbaee1e44"} Feb 23 14:50:56.709669 master-0 kubenswrapper[28758]: I0223 14:50:56.709128 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerStarted","Data":"8460f6779a35448eb9c96b1cf342eaf9e2fc9268b63dcabe806b79c8a1025f12"} Feb 23 14:50:56.714501 master-0 kubenswrapper[28758]: I0223 14:50:56.714444 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-external-api-0" event={"ID":"509b3d39-1b4f-440e-831b-997244388255","Type":"ContainerStarted","Data":"d0b9eab4b7f47e6936851e28303a51410da4536454cce4d5a0ffdd565854ed08"} Feb 23 14:50:56.723370 master-0 kubenswrapper[28758]: I0223 14:50:56.723275 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"3be6a9b5-62a2-49f0-8871-aed1d7a7f588","Type":"ContainerStarted","Data":"5684a4efb417f31630fda1804f705133b44960c852f9a48de504e822eeb271c5"} Feb 23 14:50:56.723370 master-0 kubenswrapper[28758]: I0223 14:50:56.723320 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"3be6a9b5-62a2-49f0-8871-aed1d7a7f588","Type":"ContainerStarted","Data":"7fb809e088c100de1d9dd9124060a7971190970d754f8bbb5f907eff673af091"} Feb 23 14:50:56.794984 master-0 kubenswrapper[28758]: I0223 14:50:56.794903 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-63e78-default-external-api-0" podStartSLOduration=4.794883931 podStartE2EDuration="4.794883931s" podCreationTimestamp="2026-02-23 14:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:56.78505147 +0000 UTC m=+988.911367402" watchObservedRunningTime="2026-02-23 14:50:56.794883931 +0000 UTC m=+988.921199863" Feb 23 14:50:57.045849 master-0 kubenswrapper[28758]: I0223 14:50:57.045775 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-677b4847c-74h4p" podUID="07aaf639-2ccb-4ffb-be79-572b8990a03a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.232:5353: i/o timeout" Feb 23 14:50:57.739997 master-0 kubenswrapper[28758]: I0223 14:50:57.739901 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-63e78-default-internal-api-0" event={"ID":"3be6a9b5-62a2-49f0-8871-aed1d7a7f588","Type":"ContainerStarted","Data":"e061751ee630cb4582d986998cb80882db260ae781e60ae3d730bf74abb020c9"} Feb 23 14:50:57.745116 master-0 kubenswrapper[28758]: I0223 14:50:57.745065 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerStarted","Data":"a16f3c0525fe6cdd8945a407e04e6d91989b66f3e9164fb393b60f40c3e4faea"} Feb 23 14:50:57.835818 master-0 kubenswrapper[28758]: I0223 14:50:57.835716 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-63e78-default-internal-api-0" podStartSLOduration=4.835695615 podStartE2EDuration="4.835695615s" podCreationTimestamp="2026-02-23 14:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:50:57.825798952 +0000 UTC m=+989.952114884" watchObservedRunningTime="2026-02-23 14:50:57.835695615 +0000 UTC 
m=+989.962011547" Feb 23 14:50:58.760047 master-0 kubenswrapper[28758]: I0223 14:50:58.758068 28758 generic.go:334] "Generic (PLEG): container finished" podID="5819b1bd-dd89-45b4-84ed-83a1355314de" containerID="a16f3c0525fe6cdd8945a407e04e6d91989b66f3e9164fb393b60f40c3e4faea" exitCode=0 Feb 23 14:50:58.760580 master-0 kubenswrapper[28758]: I0223 14:50:58.760239 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerDied","Data":"a16f3c0525fe6cdd8945a407e04e6d91989b66f3e9164fb393b60f40c3e4faea"} Feb 23 14:50:59.800774 master-0 kubenswrapper[28758]: I0223 14:50:59.800708 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerStarted","Data":"338729fbfb3b560d3acbd4e20960eccb9240c615d4f1ae8daec3a907c32a88e9"} Feb 23 14:51:00.827287 master-0 kubenswrapper[28758]: I0223 14:51:00.827123 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerStarted","Data":"51230004684e78e6bd8b2e480432f3f3f80878b8948390bd9e32bad9fe8f145b"} Feb 23 14:51:00.827287 master-0 kubenswrapper[28758]: I0223 14:51:00.827194 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerStarted","Data":"3ac9eda1146555095f49328d1efd4ff06736b6c1ecb20560d4093ecee73e0fc9"} Feb 23 14:51:01.844315 master-0 kubenswrapper[28758]: I0223 14:51:01.842940 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerStarted","Data":"8bc968499d8760e051109ae8a984a0655a39ff65a26dc6856f207f37d6c67652"} Feb 23 14:51:02.865611 master-0 kubenswrapper[28758]: I0223 14:51:02.865451 28758 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ironic-inspector-0" event={"ID":"5819b1bd-dd89-45b4-84ed-83a1355314de","Type":"ContainerStarted","Data":"cbd186943873f72de24a992831d2322b1db20ce63135d713ff0725caa2fb0c78"} Feb 23 14:51:02.866343 master-0 kubenswrapper[28758]: I0223 14:51:02.866322 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Feb 23 14:51:02.992635 master-0 kubenswrapper[28758]: I0223 14:51:02.992491 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=8.992452955 podStartE2EDuration="8.992452955s" podCreationTimestamp="2026-02-23 14:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:02.981040882 +0000 UTC m=+995.107356854" watchObservedRunningTime="2026-02-23 14:51:02.992452955 +0000 UTC m=+995.118768887" Feb 23 14:51:03.877417 master-0 kubenswrapper[28758]: I0223 14:51:03.877336 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Feb 23 14:51:04.192471 master-0 kubenswrapper[28758]: I0223 14:51:04.192351 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:51:04.192732 master-0 kubenswrapper[28758]: I0223 14:51:04.192553 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:51:04.230415 master-0 kubenswrapper[28758]: I0223 14:51:04.230349 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:51:04.241099 master-0 kubenswrapper[28758]: I0223 14:51:04.241038 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:51:04.889912 master-0 
kubenswrapper[28758]: I0223 14:51:04.889826 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:51:04.889912 master-0 kubenswrapper[28758]: I0223 14:51:04.889893 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:51:05.071402 master-0 kubenswrapper[28758]: I0223 14:51:05.071341 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:05.071402 master-0 kubenswrapper[28758]: I0223 14:51:05.071401 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:05.116799 master-0 kubenswrapper[28758]: I0223 14:51:05.116720 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:05.129284 master-0 kubenswrapper[28758]: I0223 14:51:05.129220 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:05.188025 master-0 kubenswrapper[28758]: I0223 14:51:05.187961 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.188025 master-0 kubenswrapper[28758]: I0223 14:51:05.188030 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.188025 master-0 kubenswrapper[28758]: I0223 14:51:05.188047 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.188849 master-0 kubenswrapper[28758]: I0223 14:51:05.188064 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.227575 master-0 kubenswrapper[28758]: I0223 14:51:05.226249 28758 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.229835 master-0 kubenswrapper[28758]: I0223 14:51:05.229783 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.901117 master-0 kubenswrapper[28758]: I0223 14:51:05.901055 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:05.901796 master-0 kubenswrapper[28758]: I0223 14:51:05.901759 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:05.902702 master-0 kubenswrapper[28758]: I0223 14:51:05.902652 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.908502 master-0 kubenswrapper[28758]: I0223 14:51:05.907863 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.910792 master-0 kubenswrapper[28758]: I0223 14:51:05.910749 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Feb 23 14:51:05.947890 master-0 kubenswrapper[28758]: I0223 14:51:05.947828 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0" Feb 23 14:51:07.104062 master-0 kubenswrapper[28758]: I0223 14:51:07.103989 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-external-api-0" Feb 23 14:51:07.104770 master-0 kubenswrapper[28758]: I0223 14:51:07.104130 28758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:51:07.105528 master-0 kubenswrapper[28758]: I0223 14:51:07.105496 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-external-api-0" Feb 23 
14:51:07.977558 master-0 kubenswrapper[28758]: I0223 14:51:07.977397 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:07.977558 master-0 kubenswrapper[28758]: I0223 14:51:07.977527 28758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 14:51:07.978942 master-0 kubenswrapper[28758]: I0223 14:51:07.978857 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-63e78-default-internal-api-0" Feb 23 14:51:10.020618 master-0 kubenswrapper[28758]: I0223 14:51:10.020507 28758 generic.go:334] "Generic (PLEG): container finished" podID="76227c43-1d77-4dd0-93fd-90a100ccb01e" containerID="3656f1f4e8318bcc4bcd30498a1cd13ed4f65f408c0527816b585587aefa1cae" exitCode=0 Feb 23 14:51:10.021254 master-0 kubenswrapper[28758]: I0223 14:51:10.020596 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-454z9" event={"ID":"76227c43-1d77-4dd0-93fd-90a100ccb01e","Type":"ContainerDied","Data":"3656f1f4e8318bcc4bcd30498a1cd13ed4f65f408c0527816b585587aefa1cae"} Feb 23 14:51:11.487835 master-0 kubenswrapper[28758]: I0223 14:51:11.486627 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:51:11.644634 master-0 kubenswrapper[28758]: I0223 14:51:11.644504 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 14:51:11.645154 master-0 kubenswrapper[28758]: E0223 14:51:11.645112 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76227c43-1d77-4dd0-93fd-90a100ccb01e" containerName="nova-cell0-conductor-db-sync" Feb 23 14:51:11.645154 master-0 kubenswrapper[28758]: I0223 14:51:11.645132 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="76227c43-1d77-4dd0-93fd-90a100ccb01e" containerName="nova-cell0-conductor-db-sync" Feb 23 14:51:11.645403 master-0 kubenswrapper[28758]: I0223 14:51:11.645376 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="76227c43-1d77-4dd0-93fd-90a100ccb01e" containerName="nova-cell0-conductor-db-sync" Feb 23 14:51:11.646211 master-0 kubenswrapper[28758]: I0223 14:51:11.646182 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.656504 master-0 kubenswrapper[28758]: I0223 14:51:11.655942 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 14:51:11.656504 master-0 kubenswrapper[28758]: I0223 14:51:11.656112 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-scripts\") pod \"76227c43-1d77-4dd0-93fd-90a100ccb01e\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " Feb 23 14:51:11.656504 master-0 kubenswrapper[28758]: I0223 14:51:11.656229 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-combined-ca-bundle\") pod \"76227c43-1d77-4dd0-93fd-90a100ccb01e\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " Feb 23 14:51:11.656504 master-0 kubenswrapper[28758]: I0223 14:51:11.656326 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-config-data\") pod \"76227c43-1d77-4dd0-93fd-90a100ccb01e\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " Feb 23 14:51:11.656504 master-0 kubenswrapper[28758]: I0223 14:51:11.656394 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6rhr\" (UniqueName: \"kubernetes.io/projected/76227c43-1d77-4dd0-93fd-90a100ccb01e-kube-api-access-z6rhr\") pod \"76227c43-1d77-4dd0-93fd-90a100ccb01e\" (UID: \"76227c43-1d77-4dd0-93fd-90a100ccb01e\") " Feb 23 14:51:11.674539 master-0 kubenswrapper[28758]: I0223 14:51:11.674123 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76227c43-1d77-4dd0-93fd-90a100ccb01e-kube-api-access-z6rhr" (OuterVolumeSpecName: "kube-api-access-z6rhr") pod 
"76227c43-1d77-4dd0-93fd-90a100ccb01e" (UID: "76227c43-1d77-4dd0-93fd-90a100ccb01e"). InnerVolumeSpecName "kube-api-access-z6rhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:51:11.675417 master-0 kubenswrapper[28758]: I0223 14:51:11.675304 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-scripts" (OuterVolumeSpecName: "scripts") pod "76227c43-1d77-4dd0-93fd-90a100ccb01e" (UID: "76227c43-1d77-4dd0-93fd-90a100ccb01e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:11.694824 master-0 kubenswrapper[28758]: I0223 14:51:11.694618 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76227c43-1d77-4dd0-93fd-90a100ccb01e" (UID: "76227c43-1d77-4dd0-93fd-90a100ccb01e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:11.699802 master-0 kubenswrapper[28758]: I0223 14:51:11.699695 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-config-data" (OuterVolumeSpecName: "config-data") pod "76227c43-1d77-4dd0-93fd-90a100ccb01e" (UID: "76227c43-1d77-4dd0-93fd-90a100ccb01e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:11.763206 master-0 kubenswrapper[28758]: I0223 14:51:11.763117 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.764075 master-0 kubenswrapper[28758]: I0223 14:51:11.763528 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkll\" (UniqueName: \"kubernetes.io/projected/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-kube-api-access-gjkll\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.764075 master-0 kubenswrapper[28758]: I0223 14:51:11.763746 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.764075 master-0 kubenswrapper[28758]: I0223 14:51:11.763984 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:11.764075 master-0 kubenswrapper[28758]: I0223 14:51:11.764002 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:11.764075 master-0 kubenswrapper[28758]: I0223 14:51:11.764017 28758 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/76227c43-1d77-4dd0-93fd-90a100ccb01e-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:11.764075 master-0 kubenswrapper[28758]: I0223 14:51:11.764030 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6rhr\" (UniqueName: \"kubernetes.io/projected/76227c43-1d77-4dd0-93fd-90a100ccb01e-kube-api-access-z6rhr\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:11.866215 master-0 kubenswrapper[28758]: I0223 14:51:11.866144 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.866505 master-0 kubenswrapper[28758]: I0223 14:51:11.866325 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.866505 master-0 kubenswrapper[28758]: I0223 14:51:11.866436 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkll\" (UniqueName: \"kubernetes.io/projected/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-kube-api-access-gjkll\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.870005 master-0 kubenswrapper[28758]: I0223 14:51:11.869972 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" 
Feb 23 14:51:11.876591 master-0 kubenswrapper[28758]: I0223 14:51:11.876466 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:11.886592 master-0 kubenswrapper[28758]: I0223 14:51:11.886531 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkll\" (UniqueName: \"kubernetes.io/projected/0aeba98f-5080-49b7-bcc7-2afffb29d0a2-kube-api-access-gjkll\") pod \"nova-cell0-conductor-0\" (UID: \"0aeba98f-5080-49b7-bcc7-2afffb29d0a2\") " pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:12.048237 master-0 kubenswrapper[28758]: I0223 14:51:12.048172 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-454z9" event={"ID":"76227c43-1d77-4dd0-93fd-90a100ccb01e","Type":"ContainerDied","Data":"9a0eda192dfdaac647bc2de178172f8bb0b69d6badce9c5b7c54562b7f5ab4d2"} Feb 23 14:51:12.048237 master-0 kubenswrapper[28758]: I0223 14:51:12.048219 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-454z9" Feb 23 14:51:12.048935 master-0 kubenswrapper[28758]: I0223 14:51:12.048227 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a0eda192dfdaac647bc2de178172f8bb0b69d6badce9c5b7c54562b7f5ab4d2" Feb 23 14:51:12.099864 master-0 kubenswrapper[28758]: I0223 14:51:12.099785 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:12.559843 master-0 kubenswrapper[28758]: W0223 14:51:12.559778 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aeba98f_5080_49b7_bcc7_2afffb29d0a2.slice/crio-e7d5f2aa7064fc56e04690e5d0b8aab012eabe49f761e61da5ddf0ec09013059 WatchSource:0}: Error finding container e7d5f2aa7064fc56e04690e5d0b8aab012eabe49f761e61da5ddf0ec09013059: Status 404 returned error can't find the container with id e7d5f2aa7064fc56e04690e5d0b8aab012eabe49f761e61da5ddf0ec09013059 Feb 23 14:51:12.567048 master-0 kubenswrapper[28758]: I0223 14:51:12.566970 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 23 14:51:13.067577 master-0 kubenswrapper[28758]: I0223 14:51:13.064929 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0aeba98f-5080-49b7-bcc7-2afffb29d0a2","Type":"ContainerStarted","Data":"2f69664f755d0330c3851434f278ff4b7660485d5a8b927792a8474c91b01c42"} Feb 23 14:51:13.067577 master-0 kubenswrapper[28758]: I0223 14:51:13.065021 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0aeba98f-5080-49b7-bcc7-2afffb29d0a2","Type":"ContainerStarted","Data":"e7d5f2aa7064fc56e04690e5d0b8aab012eabe49f761e61da5ddf0ec09013059"} Feb 23 14:51:13.067577 master-0 kubenswrapper[28758]: I0223 14:51:13.067367 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:13.096773 master-0 kubenswrapper[28758]: I0223 14:51:13.096649 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.096620269 podStartE2EDuration="2.096620269s" podCreationTimestamp="2026-02-23 14:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:13.089840937 +0000 UTC m=+1005.216156869" watchObservedRunningTime="2026-02-23 14:51:13.096620269 +0000 UTC m=+1005.222936201" Feb 23 14:51:17.146918 master-0 kubenswrapper[28758]: I0223 14:51:17.146828 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 23 14:51:17.971144 master-0 kubenswrapper[28758]: I0223 14:51:17.971051 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-cbtv5"] Feb 23 14:51:17.985683 master-0 kubenswrapper[28758]: I0223 14:51:17.983947 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:17.986671 master-0 kubenswrapper[28758]: I0223 14:51:17.986571 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 23 14:51:17.986861 master-0 kubenswrapper[28758]: I0223 14:51:17.986812 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 23 14:51:18.000023 master-0 kubenswrapper[28758]: I0223 14:51:17.999924 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cbtv5"] Feb 23 14:51:18.078532 master-0 kubenswrapper[28758]: I0223 14:51:18.077582 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Feb 23 14:51:18.099808 master-0 kubenswrapper[28758]: I0223 14:51:18.096815 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.110931 master-0 kubenswrapper[28758]: I0223 14:51:18.110840 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data" Feb 23 14:51:18.130218 master-0 kubenswrapper[28758]: I0223 14:51:18.128028 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Feb 23 14:51:18.154651 master-0 kubenswrapper[28758]: I0223 14:51:18.151920 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-scripts\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.154651 master-0 kubenswrapper[28758]: I0223 14:51:18.152952 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-config-data\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.154651 master-0 kubenswrapper[28758]: I0223 14:51:18.153163 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.154651 master-0 kubenswrapper[28758]: I0223 14:51:18.153224 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rnp6\" (UniqueName: 
\"kubernetes.io/projected/5b0c2335-dbed-4733-997a-cb1ab862acb8-kube-api-access-6rnp6\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.260044 master-0 kubenswrapper[28758]: I0223 14:51:18.259675 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:18.260739 master-0 kubenswrapper[28758]: I0223 14:51:18.260550 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s57vc\" (UniqueName: \"kubernetes.io/projected/75831807-40a3-4be4-ab18-b122079ee4ff-kube-api-access-s57vc\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.260739 master-0 kubenswrapper[28758]: I0223 14:51:18.260629 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-scripts\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.260739 master-0 kubenswrapper[28758]: I0223 14:51:18.260687 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75831807-40a3-4be4-ab18-b122079ee4ff-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.260875 master-0 kubenswrapper[28758]: I0223 14:51:18.260770 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75831807-40a3-4be4-ab18-b122079ee4ff-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: 
\"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.260875 master-0 kubenswrapper[28758]: I0223 14:51:18.260844 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-config-data\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.260938 master-0 kubenswrapper[28758]: I0223 14:51:18.260928 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.260974 master-0 kubenswrapper[28758]: I0223 14:51:18.260944 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rnp6\" (UniqueName: \"kubernetes.io/projected/5b0c2335-dbed-4733-997a-cb1ab862acb8-kube-api-access-6rnp6\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.268687 master-0 kubenswrapper[28758]: I0223 14:51:18.261729 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:51:18.269876 master-0 kubenswrapper[28758]: I0223 14:51:18.269839 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-scripts\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.271617 master-0 kubenswrapper[28758]: I0223 14:51:18.270958 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.271617 master-0 kubenswrapper[28758]: I0223 14:51:18.271000 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-config-data\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.280140 master-0 kubenswrapper[28758]: I0223 14:51:18.279848 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 14:51:18.297472 master-0 kubenswrapper[28758]: I0223 14:51:18.297420 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rnp6\" (UniqueName: \"kubernetes.io/projected/5b0c2335-dbed-4733-997a-cb1ab862acb8-kube-api-access-6rnp6\") pod \"nova-cell0-cell-mapping-cbtv5\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") " pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.300745 master-0 kubenswrapper[28758]: I0223 14:51:18.300277 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:18.317993 
master-0 kubenswrapper[28758]: I0223 14:51:18.316291 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:51:18.336545 master-0 kubenswrapper[28758]: I0223 14:51:18.325289 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 14:51:18.336545 master-0 kubenswrapper[28758]: I0223 14:51:18.325812 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:18.336545 master-0 kubenswrapper[28758]: I0223 14:51:18.326462 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:18.401737 master-0 kubenswrapper[28758]: I0223 14:51:18.401562 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.402126 master-0 kubenswrapper[28758]: I0223 14:51:18.401870 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bjmt\" (UniqueName: \"kubernetes.io/projected/640d93a2-e933-4da5-ba03-5905a4ffecb9-kube-api-access-9bjmt\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.402126 master-0 kubenswrapper[28758]: I0223 14:51:18.402059 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-config-data\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.402412 master-0 kubenswrapper[28758]: I0223 14:51:18.402255 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640d93a2-e933-4da5-ba03-5905a4ffecb9-logs\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.402412 master-0 kubenswrapper[28758]: I0223 14:51:18.402322 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.402548 master-0 kubenswrapper[28758]: I0223 14:51:18.402437 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s57vc\" (UniqueName: \"kubernetes.io/projected/75831807-40a3-4be4-ab18-b122079ee4ff-kube-api-access-s57vc\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.410834 master-0 kubenswrapper[28758]: I0223 14:51:18.402585 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a268096-f824-4f47-a742-119b7034f507-logs\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.410834 master-0 kubenswrapper[28758]: I0223 14:51:18.403203 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75831807-40a3-4be4-ab18-b122079ee4ff-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.410834 master-0 kubenswrapper[28758]: I0223 14:51:18.403361 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8xc\" (UniqueName: \"kubernetes.io/projected/8a268096-f824-4f47-a742-119b7034f507-kube-api-access-6s8xc\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.410834 master-0 kubenswrapper[28758]: I0223 14:51:18.403459 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-config-data\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.410834 master-0 kubenswrapper[28758]: I0223 14:51:18.403528 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75831807-40a3-4be4-ab18-b122079ee4ff-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.410834 master-0 kubenswrapper[28758]: I0223 14:51:18.406994 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:18.480578 master-0 kubenswrapper[28758]: I0223 14:51:18.480402 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75831807-40a3-4be4-ab18-b122079ee4ff-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.516140 master-0 kubenswrapper[28758]: I0223 14:51:18.512463 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s57vc\" (UniqueName: \"kubernetes.io/projected/75831807-40a3-4be4-ab18-b122079ee4ff-kube-api-access-s57vc\") pod 
\"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.516140 master-0 kubenswrapper[28758]: I0223 14:51:18.513074 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75831807-40a3-4be4-ab18-b122079ee4ff-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"75831807-40a3-4be4-ab18-b122079ee4ff\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.531298 master-0 kubenswrapper[28758]: I0223 14:51:18.531193 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:18.536614 master-0 kubenswrapper[28758]: I0223 14:51:18.534424 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:51:18.554104 master-0 kubenswrapper[28758]: I0223 14:51:18.553293 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 14:51:18.563087 master-0 kubenswrapper[28758]: I0223 14:51:18.563021 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8xc\" (UniqueName: \"kubernetes.io/projected/8a268096-f824-4f47-a742-119b7034f507-kube-api-access-6s8xc\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.563410 master-0 kubenswrapper[28758]: I0223 14:51:18.563135 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-config-data\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.563410 master-0 kubenswrapper[28758]: I0223 14:51:18.563296 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.563410 master-0 kubenswrapper[28758]: I0223 14:51:18.563376 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bjmt\" (UniqueName: \"kubernetes.io/projected/640d93a2-e933-4da5-ba03-5905a4ffecb9-kube-api-access-9bjmt\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.563566 master-0 kubenswrapper[28758]: I0223 14:51:18.563447 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-config-data\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.563613 master-0 kubenswrapper[28758]: I0223 14:51:18.563570 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640d93a2-e933-4da5-ba03-5905a4ffecb9-logs\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.563613 master-0 kubenswrapper[28758]: I0223 14:51:18.563603 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.564720 master-0 kubenswrapper[28758]: I0223 14:51:18.563681 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a268096-f824-4f47-a742-119b7034f507-logs\") pod \"nova-metadata-0\" (UID: 
\"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.564720 master-0 kubenswrapper[28758]: I0223 14:51:18.564527 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a268096-f824-4f47-a742-119b7034f507-logs\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.568205 master-0 kubenswrapper[28758]: I0223 14:51:18.568150 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-config-data\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.568620 master-0 kubenswrapper[28758]: I0223 14:51:18.568575 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640d93a2-e933-4da5-ba03-5905a4ffecb9-logs\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.570078 master-0 kubenswrapper[28758]: I0223 14:51:18.569757 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-config-data\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.572200 master-0 kubenswrapper[28758]: I0223 14:51:18.572036 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.575349 master-0 kubenswrapper[28758]: I0223 14:51:18.575280 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.579723 master-0 kubenswrapper[28758]: I0223 14:51:18.579569 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 14:51:18.582172 master-0 kubenswrapper[28758]: I0223 14:51:18.581662 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.593588 master-0 kubenswrapper[28758]: I0223 14:51:18.591163 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8xc\" (UniqueName: \"kubernetes.io/projected/8a268096-f824-4f47-a742-119b7034f507-kube-api-access-6s8xc\") pod \"nova-metadata-0\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") " pod="openstack/nova-metadata-0" Feb 23 14:51:18.594906 master-0 kubenswrapper[28758]: I0223 14:51:18.594854 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 23 14:51:18.611602 master-0 kubenswrapper[28758]: I0223 14:51:18.611544 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bjmt\" (UniqueName: \"kubernetes.io/projected/640d93a2-e933-4da5-ba03-5905a4ffecb9-kube-api-access-9bjmt\") pod \"nova-api-0\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " pod="openstack/nova-api-0" Feb 23 14:51:18.637186 master-0 kubenswrapper[28758]: I0223 14:51:18.637110 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:51:18.639661 master-0 kubenswrapper[28758]: I0223 14:51:18.639461 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 14:51:18.653162 master-0 kubenswrapper[28758]: I0223 14:51:18.653048 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:18.672960 master-0 kubenswrapper[28758]: I0223 14:51:18.672148 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb4t2\" (UniqueName: \"kubernetes.io/projected/d1957082-18ba-4fec-bc8d-7daf7534c3fb-kube-api-access-nb4t2\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.672960 master-0 kubenswrapper[28758]: I0223 14:51:18.672254 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qddps\" (UniqueName: \"kubernetes.io/projected/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-kube-api-access-qddps\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.672960 master-0 kubenswrapper[28758]: I0223 14:51:18.672364 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.672960 master-0 kubenswrapper[28758]: I0223 14:51:18.672554 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.672960 master-0 kubenswrapper[28758]: I0223 14:51:18.672874 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bdb596dbf-l8mhf"] Feb 23 14:51:18.675338 master-0 kubenswrapper[28758]: I0223 14:51:18.675029 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-config-data\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.675338 master-0 kubenswrapper[28758]: I0223 14:51:18.675182 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.676822 master-0 kubenswrapper[28758]: I0223 14:51:18.676774 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.731341 master-0 kubenswrapper[28758]: I0223 14:51:18.727197 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdb596dbf-l8mhf"] Feb 23 14:51:18.745306 master-0 kubenswrapper[28758]: I0223 14:51:18.744468 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:18.758071 master-0 kubenswrapper[28758]: I0223 14:51:18.754586 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:51:18.781073 master-0 kubenswrapper[28758]: I0223 14:51:18.780722 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.781584 master-0 kubenswrapper[28758]: I0223 14:51:18.781552 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qddps\" (UniqueName: \"kubernetes.io/projected/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-kube-api-access-qddps\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.782631 master-0 kubenswrapper[28758]: I0223 14:51:18.782613 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.782839 master-0 kubenswrapper[28758]: I0223 14:51:18.782823 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltl7v\" (UniqueName: \"kubernetes.io/projected/79275321-c948-4636-bcdf-2c0ab4f02076-kube-api-access-ltl7v\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.783041 master-0 kubenswrapper[28758]: I0223 14:51:18.783020 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-svc\") pod 
\"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.783343 master-0 kubenswrapper[28758]: I0223 14:51:18.783324 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.783559 master-0 kubenswrapper[28758]: I0223 14:51:18.783544 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.783778 master-0 kubenswrapper[28758]: I0223 14:51:18.783752 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-config-data\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.783963 master-0 kubenswrapper[28758]: I0223 14:51:18.783946 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.784134 master-0 kubenswrapper[28758]: I0223 14:51:18.784118 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-config\") pod 
\"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.786371 master-0 kubenswrapper[28758]: I0223 14:51:18.786267 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.787419 master-0 kubenswrapper[28758]: I0223 14:51:18.786679 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.802074 master-0 kubenswrapper[28758]: I0223 14:51:18.800771 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.803202 master-0 kubenswrapper[28758]: I0223 14:51:18.803150 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.803394 master-0 kubenswrapper[28758]: I0223 14:51:18.799763 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb4t2\" (UniqueName: \"kubernetes.io/projected/d1957082-18ba-4fec-bc8d-7daf7534c3fb-kube-api-access-nb4t2\") pod 
\"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.804022 master-0 kubenswrapper[28758]: I0223 14:51:18.803996 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-config-data\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.824535 master-0 kubenswrapper[28758]: I0223 14:51:18.818373 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb4t2\" (UniqueName: \"kubernetes.io/projected/d1957082-18ba-4fec-bc8d-7daf7534c3fb-kube-api-access-nb4t2\") pod \"nova-scheduler-0\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:18.824535 master-0 kubenswrapper[28758]: I0223 14:51:18.820171 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qddps\" (UniqueName: \"kubernetes.io/projected/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-kube-api-access-qddps\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:18.907407 master-0 kubenswrapper[28758]: I0223 14:51:18.907354 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-swift-storage-0\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.907719 master-0 kubenswrapper[28758]: I0223 14:51:18.907689 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-config\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: 
\"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.907886 master-0 kubenswrapper[28758]: I0223 14:51:18.907817 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.908019 master-0 kubenswrapper[28758]: I0223 14:51:18.907990 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.908264 master-0 kubenswrapper[28758]: I0223 14:51:18.908229 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltl7v\" (UniqueName: \"kubernetes.io/projected/79275321-c948-4636-bcdf-2c0ab4f02076-kube-api-access-ltl7v\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.908359 master-0 kubenswrapper[28758]: I0223 14:51:18.908336 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-svc\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.908635 master-0 kubenswrapper[28758]: I0223 14:51:18.908597 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.908948 master-0 kubenswrapper[28758]: I0223 14:51:18.908903 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-nb\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.910044 master-0 kubenswrapper[28758]: I0223 14:51:18.910015 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-sb\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.910159 master-0 kubenswrapper[28758]: I0223 14:51:18.910119 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-config\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.910493 master-0 kubenswrapper[28758]: I0223 14:51:18.910362 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-svc\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.943182 master-0 kubenswrapper[28758]: I0223 14:51:18.943111 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltl7v\" (UniqueName: \"kubernetes.io/projected/79275321-c948-4636-bcdf-2c0ab4f02076-kube-api-access-ltl7v\") pod \"dnsmasq-dns-5bdb596dbf-l8mhf\" (UID: 
\"79275321-c948-4636-bcdf-2c0ab4f02076\") " pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:18.961293 master-0 kubenswrapper[28758]: I0223 14:51:18.961115 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:51:18.988317 master-0 kubenswrapper[28758]: I0223 14:51:18.987784 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:19.053732 master-0 kubenswrapper[28758]: I0223 14:51:19.053668 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:51:19.177091 master-0 kubenswrapper[28758]: W0223 14:51:19.176964 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0c2335_dbed_4733_997a_cb1ab862acb8.slice/crio-a2188aa48f998198f8d7dad7bea4be79a7e51c23eef98a4421f43af2c598cbe8 WatchSource:0}: Error finding container a2188aa48f998198f8d7dad7bea4be79a7e51c23eef98a4421f43af2c598cbe8: Status 404 returned error can't find the container with id a2188aa48f998198f8d7dad7bea4be79a7e51c23eef98a4421f43af2c598cbe8 Feb 23 14:51:19.182868 master-0 kubenswrapper[28758]: I0223 14:51:19.180266 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-cbtv5"] Feb 23 14:51:19.240063 master-0 kubenswrapper[28758]: I0223 14:51:19.239774 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mj69t"] Feb 23 14:51:19.243183 master-0 kubenswrapper[28758]: I0223 14:51:19.242747 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.250967 master-0 kubenswrapper[28758]: I0223 14:51:19.247347 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 23 14:51:19.250967 master-0 kubenswrapper[28758]: I0223 14:51:19.247722 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 23 14:51:19.323409 master-0 kubenswrapper[28758]: I0223 14:51:19.323320 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-scripts\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.323886 master-0 kubenswrapper[28758]: I0223 14:51:19.323845 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqmcg\" (UniqueName: \"kubernetes.io/projected/bb4acaac-3f2b-41c9-a0fc-76232abf5168-kube-api-access-qqmcg\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.323966 master-0 kubenswrapper[28758]: I0223 14:51:19.323885 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.324017 master-0 kubenswrapper[28758]: I0223 14:51:19.323977 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-config-data\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.324292 master-0 kubenswrapper[28758]: I0223 14:51:19.324102 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mj69t"] Feb 23 14:51:19.345344 master-0 kubenswrapper[28758]: I0223 14:51:19.343395 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:19.427376 master-0 kubenswrapper[28758]: W0223 14:51:19.425722 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75831807_40a3_4be4_ab18_b122079ee4ff.slice/crio-4587717311da6b1c314014bd56a8977c264173e6c2dfa0c9c78e688e0dae46f6 WatchSource:0}: Error finding container 4587717311da6b1c314014bd56a8977c264173e6c2dfa0c9c78e688e0dae46f6: Status 404 returned error can't find the container with id 4587717311da6b1c314014bd56a8977c264173e6c2dfa0c9c78e688e0dae46f6 Feb 23 14:51:19.430551 master-0 kubenswrapper[28758]: I0223 14:51:19.430356 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Feb 23 14:51:19.433604 master-0 kubenswrapper[28758]: I0223 14:51:19.432420 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqmcg\" (UniqueName: \"kubernetes.io/projected/bb4acaac-3f2b-41c9-a0fc-76232abf5168-kube-api-access-qqmcg\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.433604 master-0 kubenswrapper[28758]: I0223 14:51:19.432511 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.433604 master-0 kubenswrapper[28758]: I0223 14:51:19.433123 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-config-data\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.433604 master-0 kubenswrapper[28758]: I0223 14:51:19.433422 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-scripts\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.438842 master-0 kubenswrapper[28758]: I0223 14:51:19.438761 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-scripts\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.439188 master-0 kubenswrapper[28758]: I0223 14:51:19.439098 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.440585 master-0 kubenswrapper[28758]: I0223 14:51:19.440557 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-config-data\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.496360 master-0 kubenswrapper[28758]: I0223 14:51:19.492013 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqmcg\" (UniqueName: \"kubernetes.io/projected/bb4acaac-3f2b-41c9-a0fc-76232abf5168-kube-api-access-qqmcg\") pod \"nova-cell1-conductor-db-sync-mj69t\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.589689 master-0 kubenswrapper[28758]: W0223 14:51:19.589520 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod640d93a2_e933_4da5_ba03_5905a4ffecb9.slice/crio-a01951abc66220b1818e86a5a6272abb024af614378c542768afc7fd6a988170 WatchSource:0}: Error finding container a01951abc66220b1818e86a5a6272abb024af614378c542768afc7fd6a988170: Status 404 returned error can't find the container with id a01951abc66220b1818e86a5a6272abb024af614378c542768afc7fd6a988170 Feb 23 14:51:19.592857 master-0 kubenswrapper[28758]: I0223 14:51:19.592791 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:19.776534 master-0 kubenswrapper[28758]: I0223 14:51:19.776338 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:19.939930 master-0 kubenswrapper[28758]: W0223 14:51:19.939863 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c86c1e1_e667_49c5_ae5d_8ef8fa0f4b21.slice/crio-f1c8a2e599b5146090f11d5b5cf49421f75ca9c7a6543d507abc1c487876cc52 WatchSource:0}: Error finding container f1c8a2e599b5146090f11d5b5cf49421f75ca9c7a6543d507abc1c487876cc52: Status 404 returned error can't find the container with id f1c8a2e599b5146090f11d5b5cf49421f75ca9c7a6543d507abc1c487876cc52 Feb 23 14:51:20.001054 master-0 kubenswrapper[28758]: I0223 14:51:20.000806 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:20.070061 master-0 kubenswrapper[28758]: I0223 14:51:20.069907 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 23 14:51:20.110492 master-0 kubenswrapper[28758]: W0223 14:51:20.110351 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79275321_c948_4636_bcdf_2c0ab4f02076.slice/crio-a11c29c95bb00a031fbb9143b7d7392fc1b1c8c04248974cbe27815fdfe13ff4 WatchSource:0}: Error finding container a11c29c95bb00a031fbb9143b7d7392fc1b1c8c04248974cbe27815fdfe13ff4: Status 404 returned error can't find the container with id a11c29c95bb00a031fbb9143b7d7392fc1b1c8c04248974cbe27815fdfe13ff4 Feb 23 14:51:20.119436 master-0 kubenswrapper[28758]: I0223 14:51:20.119366 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bdb596dbf-l8mhf"] Feb 23 14:51:20.227769 master-0 kubenswrapper[28758]: I0223 14:51:20.227654 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" 
event={"ID":"79275321-c948-4636-bcdf-2c0ab4f02076","Type":"ContainerStarted","Data":"a11c29c95bb00a031fbb9143b7d7392fc1b1c8c04248974cbe27815fdfe13ff4"} Feb 23 14:51:20.230904 master-0 kubenswrapper[28758]: I0223 14:51:20.230595 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"75831807-40a3-4be4-ab18-b122079ee4ff","Type":"ContainerStarted","Data":"4587717311da6b1c314014bd56a8977c264173e6c2dfa0c9c78e688e0dae46f6"} Feb 23 14:51:20.236611 master-0 kubenswrapper[28758]: I0223 14:51:20.236403 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a268096-f824-4f47-a742-119b7034f507","Type":"ContainerStarted","Data":"5216b999aec1b765e5e1751e56fe934a2fb6fa272f1e995c91f70be1382d8b1b"} Feb 23 14:51:20.242623 master-0 kubenswrapper[28758]: I0223 14:51:20.241464 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21","Type":"ContainerStarted","Data":"f1c8a2e599b5146090f11d5b5cf49421f75ca9c7a6543d507abc1c487876cc52"} Feb 23 14:51:20.245184 master-0 kubenswrapper[28758]: I0223 14:51:20.245129 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1957082-18ba-4fec-bc8d-7daf7534c3fb","Type":"ContainerStarted","Data":"cfad930f6f41fe76e125995eb767681d5421082c63b957332c472ae639b68bbb"} Feb 23 14:51:20.251147 master-0 kubenswrapper[28758]: I0223 14:51:20.250909 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"640d93a2-e933-4da5-ba03-5905a4ffecb9","Type":"ContainerStarted","Data":"a01951abc66220b1818e86a5a6272abb024af614378c542768afc7fd6a988170"} Feb 23 14:51:20.253795 master-0 kubenswrapper[28758]: I0223 14:51:20.253731 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cbtv5" 
event={"ID":"5b0c2335-dbed-4733-997a-cb1ab862acb8","Type":"ContainerStarted","Data":"8a63eb55160a29c3331d218879d84fd9fc1563ed7864fbf547dd3a2599195d2b"} Feb 23 14:51:20.253795 master-0 kubenswrapper[28758]: I0223 14:51:20.253763 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cbtv5" event={"ID":"5b0c2335-dbed-4733-997a-cb1ab862acb8","Type":"ContainerStarted","Data":"a2188aa48f998198f8d7dad7bea4be79a7e51c23eef98a4421f43af2c598cbe8"} Feb 23 14:51:20.468525 master-0 kubenswrapper[28758]: I0223 14:51:20.459959 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-cbtv5" podStartSLOduration=3.459940415 podStartE2EDuration="3.459940415s" podCreationTimestamp="2026-02-23 14:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:20.288907182 +0000 UTC m=+1012.415223114" watchObservedRunningTime="2026-02-23 14:51:20.459940415 +0000 UTC m=+1012.586256347" Feb 23 14:51:20.478776 master-0 kubenswrapper[28758]: I0223 14:51:20.471839 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mj69t"] Feb 23 14:51:21.284842 master-0 kubenswrapper[28758]: I0223 14:51:21.284690 28758 generic.go:334] "Generic (PLEG): container finished" podID="79275321-c948-4636-bcdf-2c0ab4f02076" containerID="40de82de7656f911efb90d8bed9335961f3d740b5bdd5f06264ab043010de138" exitCode=0 Feb 23 14:51:21.284842 master-0 kubenswrapper[28758]: I0223 14:51:21.284752 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" event={"ID":"79275321-c948-4636-bcdf-2c0ab4f02076","Type":"ContainerDied","Data":"40de82de7656f911efb90d8bed9335961f3d740b5bdd5f06264ab043010de138"} Feb 23 14:51:21.293461 master-0 kubenswrapper[28758]: I0223 14:51:21.293372 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-mj69t" event={"ID":"bb4acaac-3f2b-41c9-a0fc-76232abf5168","Type":"ContainerStarted","Data":"1c658e32c2067f75610049b7ce969bb955f138c4bfd9451049314dae8f697706"}
Feb 23 14:51:21.293461 master-0 kubenswrapper[28758]: I0223 14:51:21.293432 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mj69t" event={"ID":"bb4acaac-3f2b-41c9-a0fc-76232abf5168","Type":"ContainerStarted","Data":"752ba255d9f22ebb897e447805a0d17c1973c9ec8ec0d3159b46b9554836bb5b"}
Feb 23 14:51:21.721264 master-0 kubenswrapper[28758]: I0223 14:51:21.721177 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mj69t" podStartSLOduration=2.721155302 podStartE2EDuration="2.721155302s" podCreationTimestamp="2026-02-23 14:51:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:21.709895203 +0000 UTC m=+1013.836211135" watchObservedRunningTime="2026-02-23 14:51:21.721155302 +0000 UTC m=+1013.847471244"
Feb 23 14:51:23.205506 master-0 kubenswrapper[28758]: I0223 14:51:23.202039 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 14:51:23.221603 master-0 kubenswrapper[28758]: I0223 14:51:23.216065 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 14:51:24.359564 master-0 kubenswrapper[28758]: I0223 14:51:24.357651 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a268096-f824-4f47-a742-119b7034f507","Type":"ContainerStarted","Data":"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"}
Feb 23 14:51:24.359564 master-0 kubenswrapper[28758]: I0223 14:51:24.357778 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a268096-f824-4f47-a742-119b7034f507","Type":"ContainerStarted","Data":"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"}
Feb 23 14:51:24.359564 master-0 kubenswrapper[28758]: I0223 14:51:24.357810 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-log" containerID="cri-o://64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f" gracePeriod=30
Feb 23 14:51:24.359564 master-0 kubenswrapper[28758]: I0223 14:51:24.357843 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-metadata" containerID="cri-o://e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f" gracePeriod=30
Feb 23 14:51:24.365389 master-0 kubenswrapper[28758]: I0223 14:51:24.365017 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21","Type":"ContainerStarted","Data":"e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda"}
Feb 23 14:51:24.365389 master-0 kubenswrapper[28758]: I0223 14:51:24.365128 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda" gracePeriod=30
Feb 23 14:51:24.370762 master-0 kubenswrapper[28758]: I0223 14:51:24.370317 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1957082-18ba-4fec-bc8d-7daf7534c3fb","Type":"ContainerStarted","Data":"4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1"}
Feb 23 14:51:24.374246 master-0 kubenswrapper[28758]: I0223 14:51:24.373170 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"640d93a2-e933-4da5-ba03-5905a4ffecb9","Type":"ContainerStarted","Data":"1e53d8ef27aac9eded58759ed6131f332f53b8102de278b9ab9b1a7074c198ee"}
Feb 23 14:51:24.374246 master-0 kubenswrapper[28758]: I0223 14:51:24.373219 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"640d93a2-e933-4da5-ba03-5905a4ffecb9","Type":"ContainerStarted","Data":"2bb1021226f9c42c3894ff0f9c7ff82d81ccb8b5659360d06cb7c123cae8ae92"}
Feb 23 14:51:24.376209 master-0 kubenswrapper[28758]: I0223 14:51:24.376165 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" event={"ID":"79275321-c948-4636-bcdf-2c0ab4f02076","Type":"ContainerStarted","Data":"95833dc474850c32d6b3fae9ed6bd7114059237225d21845650659f7fdc6d96f"}
Feb 23 14:51:24.376512 master-0 kubenswrapper[28758]: I0223 14:51:24.376412 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf"
Feb 23 14:51:24.422214 master-0 kubenswrapper[28758]: I0223 14:51:24.422087 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.543728587 podStartE2EDuration="6.422062041s" podCreationTimestamp="2026-02-23 14:51:18 +0000 UTC" firstStartedPulling="2026-02-23 14:51:19.374935398 +0000 UTC m=+1011.501251330" lastFinishedPulling="2026-02-23 14:51:23.253268852 +0000 UTC m=+1015.379584784" observedRunningTime="2026-02-23 14:51:24.413978102 +0000 UTC m=+1016.540294044" watchObservedRunningTime="2026-02-23 14:51:24.422062041 +0000 UTC m=+1016.548377973"
Feb 23 14:51:24.446007 master-0 kubenswrapper[28758]: I0223 14:51:24.444608 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.141884207 podStartE2EDuration="6.444579319s" podCreationTimestamp="2026-02-23 14:51:18 +0000 UTC" firstStartedPulling="2026-02-23 14:51:19.950613071 +0000 UTC m=+1012.076928993" lastFinishedPulling="2026-02-23 14:51:23.253308173 +0000 UTC m=+1015.379624105" observedRunningTime="2026-02-23 14:51:24.438097575 +0000 UTC m=+1016.564413507" watchObservedRunningTime="2026-02-23 14:51:24.444579319 +0000 UTC m=+1016.570895251"
Feb 23 14:51:24.474510 master-0 kubenswrapper[28758]: I0223 14:51:24.472117 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" podStartSLOduration=6.472092688 podStartE2EDuration="6.472092688s" podCreationTimestamp="2026-02-23 14:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:24.462082824 +0000 UTC m=+1016.588398756" watchObservedRunningTime="2026-02-23 14:51:24.472092688 +0000 UTC m=+1016.598408630"
Feb 23 14:51:24.493889 master-0 kubenswrapper[28758]: I0223 14:51:24.492162 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.829768898 podStartE2EDuration="6.492134445s" podCreationTimestamp="2026-02-23 14:51:18 +0000 UTC" firstStartedPulling="2026-02-23 14:51:19.59533421 +0000 UTC m=+1011.721650142" lastFinishedPulling="2026-02-23 14:51:23.257699757 +0000 UTC m=+1015.384015689" observedRunningTime="2026-02-23 14:51:24.484573631 +0000 UTC m=+1016.610889593" watchObservedRunningTime="2026-02-23 14:51:24.492134445 +0000 UTC m=+1016.618450377"
Feb 23 14:51:25.249366 master-0 kubenswrapper[28758]: I0223 14:51:25.249316 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 14:51:25.285511 master-0 kubenswrapper[28758]: I0223 14:51:25.280466 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.9592071840000003 podStartE2EDuration="7.28042735s" podCreationTimestamp="2026-02-23 14:51:18 +0000 UTC" firstStartedPulling="2026-02-23 14:51:19.930234104 +0000 UTC m=+1012.056550036" lastFinishedPulling="2026-02-23 14:51:23.25145427 +0000 UTC m=+1015.377770202" observedRunningTime="2026-02-23 14:51:24.520639833 +0000 UTC m=+1016.646955765" watchObservedRunningTime="2026-02-23 14:51:25.28042735 +0000 UTC m=+1017.406743282"
Feb 23 14:51:25.302667 master-0 kubenswrapper[28758]: I0223 14:51:25.296803 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-config-data\") pod \"8a268096-f824-4f47-a742-119b7034f507\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") "
Feb 23 14:51:25.302667 master-0 kubenswrapper[28758]: I0223 14:51:25.297636 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s8xc\" (UniqueName: \"kubernetes.io/projected/8a268096-f824-4f47-a742-119b7034f507-kube-api-access-6s8xc\") pod \"8a268096-f824-4f47-a742-119b7034f507\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") "
Feb 23 14:51:25.302667 master-0 kubenswrapper[28758]: I0223 14:51:25.299804 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a268096-f824-4f47-a742-119b7034f507-logs\") pod \"8a268096-f824-4f47-a742-119b7034f507\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") "
Feb 23 14:51:25.302667 master-0 kubenswrapper[28758]: I0223 14:51:25.299874 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-combined-ca-bundle\") pod \"8a268096-f824-4f47-a742-119b7034f507\" (UID: \"8a268096-f824-4f47-a742-119b7034f507\") "
Feb 23 14:51:25.302667 master-0 kubenswrapper[28758]: I0223 14:51:25.300744 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a268096-f824-4f47-a742-119b7034f507-logs" (OuterVolumeSpecName: "logs") pod "8a268096-f824-4f47-a742-119b7034f507" (UID: "8a268096-f824-4f47-a742-119b7034f507"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 14:51:25.302667 master-0 kubenswrapper[28758]: I0223 14:51:25.301766 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a268096-f824-4f47-a742-119b7034f507-logs\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:25.305607 master-0 kubenswrapper[28758]: I0223 14:51:25.305544 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a268096-f824-4f47-a742-119b7034f507-kube-api-access-6s8xc" (OuterVolumeSpecName: "kube-api-access-6s8xc") pod "8a268096-f824-4f47-a742-119b7034f507" (UID: "8a268096-f824-4f47-a742-119b7034f507"). InnerVolumeSpecName "kube-api-access-6s8xc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:51:25.346841 master-0 kubenswrapper[28758]: I0223 14:51:25.346737 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-config-data" (OuterVolumeSpecName: "config-data") pod "8a268096-f824-4f47-a742-119b7034f507" (UID: "8a268096-f824-4f47-a742-119b7034f507"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:51:25.383517 master-0 kubenswrapper[28758]: I0223 14:51:25.383316 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a268096-f824-4f47-a742-119b7034f507" (UID: "8a268096-f824-4f47-a742-119b7034f507"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:51:25.400244 master-0 kubenswrapper[28758]: I0223 14:51:25.399492 28758 generic.go:334] "Generic (PLEG): container finished" podID="8a268096-f824-4f47-a742-119b7034f507" containerID="e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f" exitCode=0
Feb 23 14:51:25.400244 master-0 kubenswrapper[28758]: I0223 14:51:25.399547 28758 generic.go:334] "Generic (PLEG): container finished" podID="8a268096-f824-4f47-a742-119b7034f507" containerID="64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f" exitCode=143
Feb 23 14:51:25.406894 master-0 kubenswrapper[28758]: I0223 14:51:25.405663 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a268096-f824-4f47-a742-119b7034f507","Type":"ContainerDied","Data":"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"}
Feb 23 14:51:25.406894 master-0 kubenswrapper[28758]: I0223 14:51:25.405760 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a268096-f824-4f47-a742-119b7034f507","Type":"ContainerDied","Data":"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"}
Feb 23 14:51:25.406894 master-0 kubenswrapper[28758]: I0223 14:51:25.405781 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a268096-f824-4f47-a742-119b7034f507","Type":"ContainerDied","Data":"5216b999aec1b765e5e1751e56fe934a2fb6fa272f1e995c91f70be1382d8b1b"}
Feb 23 14:51:25.406894 master-0 kubenswrapper[28758]: I0223 14:51:25.405830 28758 scope.go:117] "RemoveContainer" containerID="e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"
Feb 23 14:51:25.409329 master-0 kubenswrapper[28758]: I0223 14:51:25.408941 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 14:51:25.420847 master-0 kubenswrapper[28758]: I0223 14:51:25.420741 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-config-data\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:25.420847 master-0 kubenswrapper[28758]: I0223 14:51:25.420805 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s8xc\" (UniqueName: \"kubernetes.io/projected/8a268096-f824-4f47-a742-119b7034f507-kube-api-access-6s8xc\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:25.420847 master-0 kubenswrapper[28758]: I0223 14:51:25.420821 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a268096-f824-4f47-a742-119b7034f507-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:25.520810 master-0 kubenswrapper[28758]: I0223 14:51:25.520753 28758 scope.go:117] "RemoveContainer" containerID="64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"
Feb 23 14:51:25.526105 master-0 kubenswrapper[28758]: I0223 14:51:25.526058 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 14:51:25.556123 master-0 kubenswrapper[28758]: I0223 14:51:25.556014 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 14:51:25.577647 master-0 kubenswrapper[28758]: I0223 14:51:25.577581 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 14:51:25.579301 master-0 kubenswrapper[28758]: E0223 14:51:25.579273 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-log"
Feb 23 14:51:25.579427 master-0 kubenswrapper[28758]: I0223 14:51:25.579410 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-log"
Feb 23 14:51:25.579602 master-0 kubenswrapper[28758]: E0223 14:51:25.579584 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-metadata"
Feb 23 14:51:25.579714 master-0 kubenswrapper[28758]: I0223 14:51:25.579692 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-metadata"
Feb 23 14:51:25.580290 master-0 kubenswrapper[28758]: I0223 14:51:25.580270 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-log"
Feb 23 14:51:25.580398 master-0 kubenswrapper[28758]: I0223 14:51:25.580383 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a268096-f824-4f47-a742-119b7034f507" containerName="nova-metadata-metadata"
Feb 23 14:51:25.582378 master-0 kubenswrapper[28758]: I0223 14:51:25.582352 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 14:51:25.585418 master-0 kubenswrapper[28758]: I0223 14:51:25.585366 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 23 14:51:25.585839 master-0 kubenswrapper[28758]: I0223 14:51:25.585728 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 23 14:51:25.611940 master-0 kubenswrapper[28758]: I0223 14:51:25.611854 28758 scope.go:117] "RemoveContainer" containerID="e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"
Feb 23 14:51:25.612568 master-0 kubenswrapper[28758]: E0223 14:51:25.612524 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f\": container with ID starting with e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f not found: ID does not exist" containerID="e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"
Feb 23 14:51:25.612730 master-0 kubenswrapper[28758]: I0223 14:51:25.612691 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"} err="failed to get container status \"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f\": rpc error: code = NotFound desc = could not find container \"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f\": container with ID starting with e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f not found: ID does not exist"
Feb 23 14:51:25.612844 master-0 kubenswrapper[28758]: I0223 14:51:25.612828 28758 scope.go:117] "RemoveContainer" containerID="64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"
Feb 23 14:51:25.613338 master-0 kubenswrapper[28758]: E0223 14:51:25.613314 28758 log.go:32]
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f\": container with ID starting with 64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f not found: ID does not exist" containerID="64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"
Feb 23 14:51:25.613457 master-0 kubenswrapper[28758]: I0223 14:51:25.613434 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"} err="failed to get container status \"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f\": rpc error: code = NotFound desc = could not find container \"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f\": container with ID starting with 64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f not found: ID does not exist"
Feb 23 14:51:25.613569 master-0 kubenswrapper[28758]: I0223 14:51:25.613552 28758 scope.go:117] "RemoveContainer" containerID="e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"
Feb 23 14:51:25.613906 master-0 kubenswrapper[28758]: I0223 14:51:25.613882 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f"} err="failed to get container status \"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f\": rpc error: code = NotFound desc = could not find container \"e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f\": container with ID starting with e221eef4b928ccc85892e679a3d096a62e410bbfccbe2581eef574d3006ad47f not found: ID does not exist"
Feb 23 14:51:25.614004 master-0 kubenswrapper[28758]: I0223 14:51:25.613990 28758 scope.go:117] "RemoveContainer" containerID="64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"
Feb 23 14:51:25.614419 master-0 kubenswrapper[28758]: I0223 14:51:25.614397 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f"} err="failed to get container status \"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f\": rpc error: code = NotFound desc = could not find container \"64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f\": container with ID starting with 64dd679fd753a3c8945bbbb458e62fd0b1d98f38303d9f6d930a83add20cee1f not found: ID does not exist"
Feb 23 14:51:25.728649 master-0 kubenswrapper[28758]: I0223 14:51:25.728435 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.728649 master-0 kubenswrapper[28758]: I0223 14:51:25.728567 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-logs\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.729134 master-0 kubenswrapper[28758]: I0223 14:51:25.729068 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-config-data\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.729244 master-0 kubenswrapper[28758]: I0223 14:51:25.729215 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.729322 master-0 kubenswrapper[28758]: I0223 14:51:25.729271 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krsb5\" (UniqueName: \"kubernetes.io/projected/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-kube-api-access-krsb5\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.833047 master-0 kubenswrapper[28758]: I0223 14:51:25.832889 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.833298 master-0 kubenswrapper[28758]: I0223 14:51:25.833075 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krsb5\" (UniqueName: \"kubernetes.io/projected/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-kube-api-access-krsb5\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.833298 master-0 kubenswrapper[28758]: I0223 14:51:25.833192 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.833298 master-0 kubenswrapper[28758]: I0223 14:51:25.833254 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-logs\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.833464 master-0 kubenswrapper[28758]: I0223 14:51:25.833340 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-config-data\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.834148 master-0 kubenswrapper[28758]: I0223 14:51:25.833890 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-logs\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.838667 master-0 kubenswrapper[28758]: I0223 14:51:25.838626 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.842532 master-0 kubenswrapper[28758]: I0223 14:51:25.842452 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-config-data\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.845437 master-0 kubenswrapper[28758]: I0223 14:51:25.845302 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.870591 master-0 kubenswrapper[28758]: I0223 14:51:25.864607 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 14:51:25.911146 master-0 kubenswrapper[28758]: I0223 14:51:25.911078 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krsb5\" (UniqueName: \"kubernetes.io/projected/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-kube-api-access-krsb5\") pod \"nova-metadata-0\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " pod="openstack/nova-metadata-0"
Feb 23 14:51:25.914596 master-0 kubenswrapper[28758]: I0223 14:51:25.913159 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 23 14:51:26.113292 master-0 kubenswrapper[28758]: I0223 14:51:26.113189 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a268096-f824-4f47-a742-119b7034f507" path="/var/lib/kubelet/pods/8a268096-f824-4f47-a742-119b7034f507/volumes"
Feb 23 14:51:26.432867 master-0 kubenswrapper[28758]: I0223 14:51:26.432553 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 23 14:51:26.433459 master-0 kubenswrapper[28758]: W0223 14:51:26.432872 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71561be7_edc6_4a2c_8145_5e9df5a3b0e2.slice/crio-d4bb47b9377bad43d1f525b1460327e91d66616d3d1c4185ca1a52c031aa0603 WatchSource:0}: Error finding container d4bb47b9377bad43d1f525b1460327e91d66616d3d1c4185ca1a52c031aa0603: Status 404 returned error can't find the container with id d4bb47b9377bad43d1f525b1460327e91d66616d3d1c4185ca1a52c031aa0603
Feb 23 14:51:27.459600 master-0 kubenswrapper[28758]: I0223 14:51:27.459522 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0"
event={"ID":"71561be7-edc6-4a2c-8145-5e9df5a3b0e2","Type":"ContainerStarted","Data":"3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89"}
Feb 23 14:51:27.459600 master-0 kubenswrapper[28758]: I0223 14:51:27.459598 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71561be7-edc6-4a2c-8145-5e9df5a3b0e2","Type":"ContainerStarted","Data":"dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e"}
Feb 23 14:51:27.460276 master-0 kubenswrapper[28758]: I0223 14:51:27.459616 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71561be7-edc6-4a2c-8145-5e9df5a3b0e2","Type":"ContainerStarted","Data":"d4bb47b9377bad43d1f525b1460327e91d66616d3d1c4185ca1a52c031aa0603"}
Feb 23 14:51:27.500591 master-0 kubenswrapper[28758]: I0223 14:51:27.494975 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.494951855 podStartE2EDuration="2.494951855s" podCreationTimestamp="2026-02-23 14:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:27.482342138 +0000 UTC m=+1019.608658070" watchObservedRunningTime="2026-02-23 14:51:27.494951855 +0000 UTC m=+1019.621267787"
Feb 23 14:51:28.757218 master-0 kubenswrapper[28758]: I0223 14:51:28.756288 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 14:51:28.757218 master-0 kubenswrapper[28758]: I0223 14:51:28.756361 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 23 14:51:28.961646 master-0 kubenswrapper[28758]: I0223 14:51:28.961530 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 23 14:51:28.961646 master-0 kubenswrapper[28758]: I0223 14:51:28.961605 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 23 14:51:28.988421 master-0 kubenswrapper[28758]: I0223 14:51:28.988357 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:29.054762 master-0 kubenswrapper[28758]: I0223 14:51:29.054649 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf"
Feb 23 14:51:29.165429 master-0 kubenswrapper[28758]: I0223 14:51:29.165364 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 23 14:51:29.513881 master-0 kubenswrapper[28758]: I0223 14:51:29.513821 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 23 14:51:29.840922 master-0 kubenswrapper[28758]: I0223 14:51:29.840748 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.2:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 14:51:29.840922 master-0 kubenswrapper[28758]: I0223 14:51:29.840756 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.2:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 23 14:51:30.088546 master-0 kubenswrapper[28758]: I0223 14:51:30.087019 28758 trace.go:236] Trace[2065145033]: "Calculate volume metrics of glance for pod openstack/glance-63e78-default-external-api-0" (23-Feb-2026 14:51:28.077) (total time: 2009ms):
Feb 23 14:51:30.088546 master-0 kubenswrapper[28758]: Trace[2065145033]: [2.009634093s] [2.009634093s] END
Feb 23 14:51:30.220495 master-0 kubenswrapper[28758]: I0223 14:51:30.215422 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c76dc76f7-sw64n"]
Feb 23 14:51:30.220495 master-0 kubenswrapper[28758]: I0223 14:51:30.215751 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerName="dnsmasq-dns" containerID="cri-o://c8c2dea1161a129d36bce1c9b0d068cc637e4dfe1f874f2c144717c424b1a8bc" gracePeriod=10
Feb 23 14:51:30.519614 master-0 kubenswrapper[28758]: I0223 14:51:30.519537 28758 generic.go:334] "Generic (PLEG): container finished" podID="5b0c2335-dbed-4733-997a-cb1ab862acb8" containerID="8a63eb55160a29c3331d218879d84fd9fc1563ed7864fbf547dd3a2599195d2b" exitCode=0
Feb 23 14:51:30.519877 master-0 kubenswrapper[28758]: I0223 14:51:30.519658 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cbtv5" event={"ID":"5b0c2335-dbed-4733-997a-cb1ab862acb8","Type":"ContainerDied","Data":"8a63eb55160a29c3331d218879d84fd9fc1563ed7864fbf547dd3a2599195d2b"}
Feb 23 14:51:30.530852 master-0 kubenswrapper[28758]: I0223 14:51:30.530755 28758 generic.go:334] "Generic (PLEG): container finished" podID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerID="c8c2dea1161a129d36bce1c9b0d068cc637e4dfe1f874f2c144717c424b1a8bc" exitCode=0
Feb 23 14:51:30.531900 master-0 kubenswrapper[28758]: I0223 14:51:30.531856 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" event={"ID":"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce","Type":"ContainerDied","Data":"c8c2dea1161a129d36bce1c9b0d068cc637e4dfe1f874f2c144717c424b1a8bc"}
Feb 23 14:51:30.914332 master-0 kubenswrapper[28758]: I0223 14:51:30.914193 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 23 14:51:30.915205 master-0 kubenswrapper[28758]: I0223 14:51:30.915179 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 23 14:51:31.884579 master-0 kubenswrapper[28758]: I0223 14:51:31.881058 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.249:5353: connect: connection refused"
Feb 23 14:51:34.875328 master-0 kubenswrapper[28758]: I0223 14:51:34.875260 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cbtv5"
Feb 23 14:51:35.046814 master-0 kubenswrapper[28758]: I0223 14:51:35.046756 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-scripts\") pod \"5b0c2335-dbed-4733-997a-cb1ab862acb8\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") "
Feb 23 14:51:35.047051 master-0 kubenswrapper[28758]: I0223 14:51:35.046969 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-config-data\") pod \"5b0c2335-dbed-4733-997a-cb1ab862acb8\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") "
Feb 23 14:51:35.047091 master-0 kubenswrapper[28758]: I0223 14:51:35.047055 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-combined-ca-bundle\") pod \"5b0c2335-dbed-4733-997a-cb1ab862acb8\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") "
Feb 23 14:51:35.047219 master-0 kubenswrapper[28758]: I0223 14:51:35.047194 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rnp6\" (UniqueName: \"kubernetes.io/projected/5b0c2335-dbed-4733-997a-cb1ab862acb8-kube-api-access-6rnp6\") pod \"5b0c2335-dbed-4733-997a-cb1ab862acb8\" (UID: \"5b0c2335-dbed-4733-997a-cb1ab862acb8\") "
Feb 23 14:51:35.050328 master-0 kubenswrapper[28758]: I0223 14:51:35.050249 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0c2335-dbed-4733-997a-cb1ab862acb8-kube-api-access-6rnp6" (OuterVolumeSpecName: "kube-api-access-6rnp6") pod "5b0c2335-dbed-4733-997a-cb1ab862acb8" (UID: "5b0c2335-dbed-4733-997a-cb1ab862acb8"). InnerVolumeSpecName "kube-api-access-6rnp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 14:51:35.051666 master-0 kubenswrapper[28758]: I0223 14:51:35.051612 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-scripts" (OuterVolumeSpecName: "scripts") pod "5b0c2335-dbed-4733-997a-cb1ab862acb8" (UID: "5b0c2335-dbed-4733-997a-cb1ab862acb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:51:35.081787 master-0 kubenswrapper[28758]: I0223 14:51:35.081670 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b0c2335-dbed-4733-997a-cb1ab862acb8" (UID: "5b0c2335-dbed-4733-997a-cb1ab862acb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:51:35.095958 master-0 kubenswrapper[28758]: I0223 14:51:35.095897 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-config-data" (OuterVolumeSpecName: "config-data") pod "5b0c2335-dbed-4733-997a-cb1ab862acb8" (UID: "5b0c2335-dbed-4733-997a-cb1ab862acb8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 14:51:35.127349 master-0 kubenswrapper[28758]: I0223 14:51:35.127301 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n"
Feb 23 14:51:35.152869 master-0 kubenswrapper[28758]: I0223 14:51:35.152802 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-config-data\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:35.152869 master-0 kubenswrapper[28758]: I0223 14:51:35.152870 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:35.152869 master-0 kubenswrapper[28758]: I0223 14:51:35.152887 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rnp6\" (UniqueName: \"kubernetes.io/projected/5b0c2335-dbed-4733-997a-cb1ab862acb8-kube-api-access-6rnp6\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:35.152869 master-0 kubenswrapper[28758]: I0223 14:51:35.152902 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b0c2335-dbed-4733-997a-cb1ab862acb8-scripts\") on node \"master-0\" DevicePath \"\""
Feb 23 14:51:35.255439 master-0 kubenswrapper[28758]: I0223 14:51:35.255249 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw79x\" (UniqueName: \"kubernetes.io/projected/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-kube-api-access-cw79x\") pod \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") "
Feb 23 14:51:35.255648 master-0 kubenswrapper[28758]: I0223 14:51:35.255610 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-config\") pod \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " Feb 23 14:51:35.255782 master-0 kubenswrapper[28758]: I0223 14:51:35.255753 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-svc\") pod \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " Feb 23 14:51:35.255831 master-0 kubenswrapper[28758]: I0223 14:51:35.255787 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-nb\") pod \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " Feb 23 14:51:35.255874 master-0 kubenswrapper[28758]: I0223 14:51:35.255833 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-swift-storage-0\") pod \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " Feb 23 14:51:35.255925 master-0 kubenswrapper[28758]: I0223 14:51:35.255886 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-sb\") pod \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\" (UID: \"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce\") " Feb 23 14:51:35.260401 master-0 kubenswrapper[28758]: I0223 14:51:35.260337 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-kube-api-access-cw79x" (OuterVolumeSpecName: "kube-api-access-cw79x") pod "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" (UID: 
"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce"). InnerVolumeSpecName "kube-api-access-cw79x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:51:35.317015 master-0 kubenswrapper[28758]: I0223 14:51:35.315518 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" (UID: "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:51:35.321646 master-0 kubenswrapper[28758]: I0223 14:51:35.321582 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" (UID: "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:51:35.325210 master-0 kubenswrapper[28758]: I0223 14:51:35.325137 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-config" (OuterVolumeSpecName: "config") pod "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" (UID: "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:51:35.325210 master-0 kubenswrapper[28758]: I0223 14:51:35.325167 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" (UID: "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:51:35.343716 master-0 kubenswrapper[28758]: I0223 14:51:35.343651 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" (UID: "76dec5c7-dcd8-4b1b-9022-8d87a29cbbce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:51:35.358927 master-0 kubenswrapper[28758]: I0223 14:51:35.358852 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw79x\" (UniqueName: \"kubernetes.io/projected/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-kube-api-access-cw79x\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:35.358927 master-0 kubenswrapper[28758]: I0223 14:51:35.358911 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:35.358927 master-0 kubenswrapper[28758]: I0223 14:51:35.358928 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:35.358927 master-0 kubenswrapper[28758]: I0223 14:51:35.358940 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:35.358927 master-0 kubenswrapper[28758]: I0223 14:51:35.358953 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:35.359625 master-0 kubenswrapper[28758]: 
I0223 14:51:35.358964 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:35.603226 master-0 kubenswrapper[28758]: I0223 14:51:35.603049 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" Feb 23 14:51:35.603726 master-0 kubenswrapper[28758]: I0223 14:51:35.603668 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c76dc76f7-sw64n" event={"ID":"76dec5c7-dcd8-4b1b-9022-8d87a29cbbce","Type":"ContainerDied","Data":"7fbd8935766a6a7baf4d15ec0f154b1d43bbb7d6395ff1b99eb58c2e47c43324"} Feb 23 14:51:35.603877 master-0 kubenswrapper[28758]: I0223 14:51:35.603855 28758 scope.go:117] "RemoveContainer" containerID="c8c2dea1161a129d36bce1c9b0d068cc637e4dfe1f874f2c144717c424b1a8bc" Feb 23 14:51:35.608132 master-0 kubenswrapper[28758]: I0223 14:51:35.608102 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"75831807-40a3-4be4-ab18-b122079ee4ff","Type":"ContainerStarted","Data":"d70a0c514cc81f1eae088b95bb0bf4155a07535a69979cb94d5fbb492624ecb2"} Feb 23 14:51:35.608585 master-0 kubenswrapper[28758]: I0223 14:51:35.608453 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:35.611099 master-0 kubenswrapper[28758]: I0223 14:51:35.611058 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-cbtv5" event={"ID":"5b0c2335-dbed-4733-997a-cb1ab862acb8","Type":"ContainerDied","Data":"a2188aa48f998198f8d7dad7bea4be79a7e51c23eef98a4421f43af2c598cbe8"} Feb 23 14:51:35.611263 master-0 kubenswrapper[28758]: I0223 14:51:35.611244 28758 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a2188aa48f998198f8d7dad7bea4be79a7e51c23eef98a4421f43af2c598cbe8" Feb 23 14:51:35.611380 master-0 kubenswrapper[28758]: I0223 14:51:35.611172 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-cbtv5" Feb 23 14:51:35.668061 master-0 kubenswrapper[28758]: I0223 14:51:35.667897 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=2.213038474 podStartE2EDuration="17.66787201s" podCreationTimestamp="2026-02-23 14:51:18 +0000 UTC" firstStartedPulling="2026-02-23 14:51:19.425676955 +0000 UTC m=+1011.551992897" lastFinishedPulling="2026-02-23 14:51:34.880510501 +0000 UTC m=+1027.006826433" observedRunningTime="2026-02-23 14:51:35.643398216 +0000 UTC m=+1027.769714168" watchObservedRunningTime="2026-02-23 14:51:35.66787201 +0000 UTC m=+1027.794187952" Feb 23 14:51:35.669048 master-0 kubenswrapper[28758]: I0223 14:51:35.669010 28758 scope.go:117] "RemoveContainer" containerID="c1d09aa7450de0a128c7415e92e3b14172427ec6e9a4711c385022c917cdd8f1" Feb 23 14:51:35.673268 master-0 kubenswrapper[28758]: I0223 14:51:35.673207 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Feb 23 14:51:35.685690 master-0 kubenswrapper[28758]: I0223 14:51:35.685586 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c76dc76f7-sw64n"] Feb 23 14:51:35.701527 master-0 kubenswrapper[28758]: I0223 14:51:35.701235 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c76dc76f7-sw64n"] Feb 23 14:51:35.913600 master-0 kubenswrapper[28758]: I0223 14:51:35.913414 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 14:51:35.917041 master-0 kubenswrapper[28758]: I0223 14:51:35.913792 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 14:51:36.116508 master-0 kubenswrapper[28758]: I0223 14:51:36.112884 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" path="/var/lib/kubelet/pods/76dec5c7-dcd8-4b1b-9022-8d87a29cbbce/volumes" Feb 23 14:51:36.121512 master-0 kubenswrapper[28758]: I0223 14:51:36.117997 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:36.121512 master-0 kubenswrapper[28758]: I0223 14:51:36.118298 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-log" containerID="cri-o://2bb1021226f9c42c3894ff0f9c7ff82d81ccb8b5659360d06cb7c123cae8ae92" gracePeriod=30 Feb 23 14:51:36.121512 master-0 kubenswrapper[28758]: I0223 14:51:36.118463 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-api" containerID="cri-o://1e53d8ef27aac9eded58759ed6131f332f53b8102de278b9ab9b1a7074c198ee" gracePeriod=30 Feb 23 14:51:36.161239 master-0 kubenswrapper[28758]: I0223 14:51:36.156096 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:36.161239 master-0 kubenswrapper[28758]: I0223 14:51:36.156325 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d1957082-18ba-4fec-bc8d-7daf7534c3fb" containerName="nova-scheduler-scheduler" containerID="cri-o://4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1" gracePeriod=30 Feb 23 14:51:36.231511 master-0 kubenswrapper[28758]: I0223 14:51:36.228002 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:36.639415 master-0 kubenswrapper[28758]: I0223 14:51:36.639347 28758 generic.go:334] "Generic 
(PLEG): container finished" podID="bb4acaac-3f2b-41c9-a0fc-76232abf5168" containerID="1c658e32c2067f75610049b7ce969bb955f138c4bfd9451049314dae8f697706" exitCode=0 Feb 23 14:51:36.639737 master-0 kubenswrapper[28758]: I0223 14:51:36.639451 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mj69t" event={"ID":"bb4acaac-3f2b-41c9-a0fc-76232abf5168","Type":"ContainerDied","Data":"1c658e32c2067f75610049b7ce969bb955f138c4bfd9451049314dae8f697706"} Feb 23 14:51:36.645613 master-0 kubenswrapper[28758]: I0223 14:51:36.644767 28758 generic.go:334] "Generic (PLEG): container finished" podID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerID="2bb1021226f9c42c3894ff0f9c7ff82d81ccb8b5659360d06cb7c123cae8ae92" exitCode=143 Feb 23 14:51:36.645613 master-0 kubenswrapper[28758]: I0223 14:51:36.644858 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"640d93a2-e933-4da5-ba03-5905a4ffecb9","Type":"ContainerDied","Data":"2bb1021226f9c42c3894ff0f9c7ff82d81ccb8b5659360d06cb7c123cae8ae92"} Feb 23 14:51:36.936501 master-0 kubenswrapper[28758]: I0223 14:51:36.934559 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.8:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:51:36.936501 master-0 kubenswrapper[28758]: I0223 14:51:36.934680 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.8:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:51:37.751205 master-0 kubenswrapper[28758]: I0223 14:51:37.749004 28758 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-metadata-0" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-log" containerID="cri-o://dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e" gracePeriod=30 Feb 23 14:51:37.751205 master-0 kubenswrapper[28758]: I0223 14:51:37.749591 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-metadata" containerID="cri-o://3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89" gracePeriod=30 Feb 23 14:51:38.591733 master-0 kubenswrapper[28758]: I0223 14:51:38.591684 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:38.771145 master-0 kubenswrapper[28758]: I0223 14:51:38.770338 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-config-data\") pod \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " Feb 23 14:51:38.771145 master-0 kubenswrapper[28758]: I0223 14:51:38.770413 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-combined-ca-bundle\") pod \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " Feb 23 14:51:38.771145 master-0 kubenswrapper[28758]: I0223 14:51:38.770560 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqmcg\" (UniqueName: \"kubernetes.io/projected/bb4acaac-3f2b-41c9-a0fc-76232abf5168-kube-api-access-qqmcg\") pod \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " Feb 23 14:51:38.771145 master-0 kubenswrapper[28758]: I0223 14:51:38.770657 28758 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-scripts\") pod \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\" (UID: \"bb4acaac-3f2b-41c9-a0fc-76232abf5168\") " Feb 23 14:51:38.774452 master-0 kubenswrapper[28758]: I0223 14:51:38.774396 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-scripts" (OuterVolumeSpecName: "scripts") pod "bb4acaac-3f2b-41c9-a0fc-76232abf5168" (UID: "bb4acaac-3f2b-41c9-a0fc-76232abf5168"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:38.777389 master-0 kubenswrapper[28758]: I0223 14:51:38.777303 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4acaac-3f2b-41c9-a0fc-76232abf5168-kube-api-access-qqmcg" (OuterVolumeSpecName: "kube-api-access-qqmcg") pod "bb4acaac-3f2b-41c9-a0fc-76232abf5168" (UID: "bb4acaac-3f2b-41c9-a0fc-76232abf5168"). InnerVolumeSpecName "kube-api-access-qqmcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:51:38.777963 master-0 kubenswrapper[28758]: I0223 14:51:38.777914 28758 generic.go:334] "Generic (PLEG): container finished" podID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerID="dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e" exitCode=143 Feb 23 14:51:38.778075 master-0 kubenswrapper[28758]: I0223 14:51:38.778001 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71561be7-edc6-4a2c-8145-5e9df5a3b0e2","Type":"ContainerDied","Data":"dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e"} Feb 23 14:51:38.781307 master-0 kubenswrapper[28758]: I0223 14:51:38.781240 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mj69t" event={"ID":"bb4acaac-3f2b-41c9-a0fc-76232abf5168","Type":"ContainerDied","Data":"752ba255d9f22ebb897e447805a0d17c1973c9ec8ec0d3159b46b9554836bb5b"} Feb 23 14:51:38.781423 master-0 kubenswrapper[28758]: I0223 14:51:38.781317 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="752ba255d9f22ebb897e447805a0d17c1973c9ec8ec0d3159b46b9554836bb5b" Feb 23 14:51:38.781562 master-0 kubenswrapper[28758]: I0223 14:51:38.781411 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mj69t" Feb 23 14:51:38.828591 master-0 kubenswrapper[28758]: I0223 14:51:38.828458 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4acaac-3f2b-41c9-a0fc-76232abf5168" (UID: "bb4acaac-3f2b-41c9-a0fc-76232abf5168"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:38.838553 master-0 kubenswrapper[28758]: I0223 14:51:38.838459 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-config-data" (OuterVolumeSpecName: "config-data") pod "bb4acaac-3f2b-41c9-a0fc-76232abf5168" (UID: "bb4acaac-3f2b-41c9-a0fc-76232abf5168"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:38.873950 master-0 kubenswrapper[28758]: I0223 14:51:38.873894 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqmcg\" (UniqueName: \"kubernetes.io/projected/bb4acaac-3f2b-41c9-a0fc-76232abf5168-kube-api-access-qqmcg\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:38.874313 master-0 kubenswrapper[28758]: I0223 14:51:38.874253 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:38.874313 master-0 kubenswrapper[28758]: I0223 14:51:38.874274 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:38.874313 master-0 kubenswrapper[28758]: I0223 14:51:38.874283 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4acaac-3f2b-41c9-a0fc-76232abf5168-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:38.888445 master-0 kubenswrapper[28758]: I0223 14:51:38.888346 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 14:51:38.889022 master-0 kubenswrapper[28758]: E0223 14:51:38.888983 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4acaac-3f2b-41c9-a0fc-76232abf5168" 
containerName="nova-cell1-conductor-db-sync" Feb 23 14:51:38.889022 master-0 kubenswrapper[28758]: I0223 14:51:38.889009 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4acaac-3f2b-41c9-a0fc-76232abf5168" containerName="nova-cell1-conductor-db-sync" Feb 23 14:51:38.889145 master-0 kubenswrapper[28758]: E0223 14:51:38.889045 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0c2335-dbed-4733-997a-cb1ab862acb8" containerName="nova-manage" Feb 23 14:51:38.889145 master-0 kubenswrapper[28758]: I0223 14:51:38.889056 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0c2335-dbed-4733-997a-cb1ab862acb8" containerName="nova-manage" Feb 23 14:51:38.889145 master-0 kubenswrapper[28758]: E0223 14:51:38.889074 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerName="init" Feb 23 14:51:38.889145 master-0 kubenswrapper[28758]: I0223 14:51:38.889082 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerName="init" Feb 23 14:51:38.889145 master-0 kubenswrapper[28758]: E0223 14:51:38.889097 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerName="dnsmasq-dns" Feb 23 14:51:38.889145 master-0 kubenswrapper[28758]: I0223 14:51:38.889104 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerName="dnsmasq-dns" Feb 23 14:51:38.889455 master-0 kubenswrapper[28758]: I0223 14:51:38.889432 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4acaac-3f2b-41c9-a0fc-76232abf5168" containerName="nova-cell1-conductor-db-sync" Feb 23 14:51:38.889455 master-0 kubenswrapper[28758]: I0223 14:51:38.889452 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0c2335-dbed-4733-997a-cb1ab862acb8" containerName="nova-manage" Feb 23 14:51:38.889616 master-0 kubenswrapper[28758]: 
I0223 14:51:38.889473 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dec5c7-dcd8-4b1b-9022-8d87a29cbbce" containerName="dnsmasq-dns" Feb 23 14:51:38.893559 master-0 kubenswrapper[28758]: I0223 14:51:38.891551 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:38.909780 master-0 kubenswrapper[28758]: I0223 14:51:38.909702 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 14:51:38.963718 master-0 kubenswrapper[28758]: E0223 14:51:38.963648 28758 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 14:51:38.965488 master-0 kubenswrapper[28758]: E0223 14:51:38.965432 28758 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 14:51:38.967074 master-0 kubenswrapper[28758]: E0223 14:51:38.966939 28758 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 23 14:51:38.967074 master-0 kubenswrapper[28758]: E0223 14:51:38.967047 28758 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d1957082-18ba-4fec-bc8d-7daf7534c3fb" containerName="nova-scheduler-scheduler" Feb 23 14:51:39.078627 master-0 kubenswrapper[28758]: I0223 14:51:39.078493 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89qq8\" (UniqueName: \"kubernetes.io/projected/6664573d-f4f2-4a39-9c7b-03c62a50a511-kube-api-access-89qq8\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.078627 master-0 kubenswrapper[28758]: I0223 14:51:39.078589 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6664573d-f4f2-4a39-9c7b-03c62a50a511-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.078877 master-0 kubenswrapper[28758]: I0223 14:51:39.078755 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6664573d-f4f2-4a39-9c7b-03c62a50a511-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.184314 master-0 kubenswrapper[28758]: I0223 14:51:39.184220 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89qq8\" (UniqueName: \"kubernetes.io/projected/6664573d-f4f2-4a39-9c7b-03c62a50a511-kube-api-access-89qq8\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.184762 master-0 kubenswrapper[28758]: I0223 14:51:39.184559 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6664573d-f4f2-4a39-9c7b-03c62a50a511-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.186700 master-0 kubenswrapper[28758]: I0223 14:51:39.185419 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6664573d-f4f2-4a39-9c7b-03c62a50a511-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.190489 master-0 kubenswrapper[28758]: I0223 14:51:39.190383 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6664573d-f4f2-4a39-9c7b-03c62a50a511-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.190616 master-0 kubenswrapper[28758]: I0223 14:51:39.190399 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6664573d-f4f2-4a39-9c7b-03c62a50a511-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.225558 master-0 kubenswrapper[28758]: I0223 14:51:39.225449 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89qq8\" (UniqueName: \"kubernetes.io/projected/6664573d-f4f2-4a39-9c7b-03c62a50a511-kube-api-access-89qq8\") pod \"nova-cell1-conductor-0\" (UID: \"6664573d-f4f2-4a39-9c7b-03c62a50a511\") " pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.257287 master-0 kubenswrapper[28758]: I0223 14:51:39.252942 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:39.721691 master-0 kubenswrapper[28758]: I0223 14:51:39.721582 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 23 14:51:39.796391 master-0 kubenswrapper[28758]: I0223 14:51:39.796321 28758 generic.go:334] "Generic (PLEG): container finished" podID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerID="1e53d8ef27aac9eded58759ed6131f332f53b8102de278b9ab9b1a7074c198ee" exitCode=0 Feb 23 14:51:39.796391 master-0 kubenswrapper[28758]: I0223 14:51:39.796396 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"640d93a2-e933-4da5-ba03-5905a4ffecb9","Type":"ContainerDied","Data":"1e53d8ef27aac9eded58759ed6131f332f53b8102de278b9ab9b1a7074c198ee"} Feb 23 14:51:40.265633 master-0 kubenswrapper[28758]: I0223 14:51:40.265560 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:51:40.324507 master-0 kubenswrapper[28758]: I0223 14:51:40.317640 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-combined-ca-bundle\") pod \"640d93a2-e933-4da5-ba03-5905a4ffecb9\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " Feb 23 14:51:40.324507 master-0 kubenswrapper[28758]: I0223 14:51:40.317732 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640d93a2-e933-4da5-ba03-5905a4ffecb9-logs\") pod \"640d93a2-e933-4da5-ba03-5905a4ffecb9\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " Feb 23 14:51:40.324507 master-0 kubenswrapper[28758]: I0223 14:51:40.317925 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bjmt\" (UniqueName: 
\"kubernetes.io/projected/640d93a2-e933-4da5-ba03-5905a4ffecb9-kube-api-access-9bjmt\") pod \"640d93a2-e933-4da5-ba03-5905a4ffecb9\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " Feb 23 14:51:40.324507 master-0 kubenswrapper[28758]: I0223 14:51:40.318095 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-config-data\") pod \"640d93a2-e933-4da5-ba03-5905a4ffecb9\" (UID: \"640d93a2-e933-4da5-ba03-5905a4ffecb9\") " Feb 23 14:51:40.324507 master-0 kubenswrapper[28758]: I0223 14:51:40.320678 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/640d93a2-e933-4da5-ba03-5905a4ffecb9-logs" (OuterVolumeSpecName: "logs") pod "640d93a2-e933-4da5-ba03-5905a4ffecb9" (UID: "640d93a2-e933-4da5-ba03-5905a4ffecb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:51:40.344566 master-0 kubenswrapper[28758]: I0223 14:51:40.344198 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640d93a2-e933-4da5-ba03-5905a4ffecb9-kube-api-access-9bjmt" (OuterVolumeSpecName: "kube-api-access-9bjmt") pod "640d93a2-e933-4da5-ba03-5905a4ffecb9" (UID: "640d93a2-e933-4da5-ba03-5905a4ffecb9"). InnerVolumeSpecName "kube-api-access-9bjmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:51:40.357723 master-0 kubenswrapper[28758]: I0223 14:51:40.357624 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-config-data" (OuterVolumeSpecName: "config-data") pod "640d93a2-e933-4da5-ba03-5905a4ffecb9" (UID: "640d93a2-e933-4da5-ba03-5905a4ffecb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:40.377030 master-0 kubenswrapper[28758]: I0223 14:51:40.376945 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "640d93a2-e933-4da5-ba03-5905a4ffecb9" (UID: "640d93a2-e933-4da5-ba03-5905a4ffecb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:40.421511 master-0 kubenswrapper[28758]: I0223 14:51:40.420255 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bjmt\" (UniqueName: \"kubernetes.io/projected/640d93a2-e933-4da5-ba03-5905a4ffecb9-kube-api-access-9bjmt\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:40.421511 master-0 kubenswrapper[28758]: I0223 14:51:40.420318 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:40.421511 master-0 kubenswrapper[28758]: I0223 14:51:40.420331 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640d93a2-e933-4da5-ba03-5905a4ffecb9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:40.421511 master-0 kubenswrapper[28758]: I0223 14:51:40.420342 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/640d93a2-e933-4da5-ba03-5905a4ffecb9-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:40.812597 master-0 kubenswrapper[28758]: I0223 14:51:40.811998 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6664573d-f4f2-4a39-9c7b-03c62a50a511","Type":"ContainerStarted","Data":"6d6c83dec4ba6a40c0344476241213f7bc7d00d8df534b417e279cb8976ded17"} Feb 23 14:51:40.812597 
master-0 kubenswrapper[28758]: I0223 14:51:40.812075 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6664573d-f4f2-4a39-9c7b-03c62a50a511","Type":"ContainerStarted","Data":"2d838adaf315b522bcf38a341bfa8c3b76b279af4a51125eb170131ac2f7d0af"} Feb 23 14:51:40.812597 master-0 kubenswrapper[28758]: I0223 14:51:40.812099 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:40.814595 master-0 kubenswrapper[28758]: I0223 14:51:40.814552 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"640d93a2-e933-4da5-ba03-5905a4ffecb9","Type":"ContainerDied","Data":"a01951abc66220b1818e86a5a6272abb024af614378c542768afc7fd6a988170"} Feb 23 14:51:40.814761 master-0 kubenswrapper[28758]: I0223 14:51:40.814743 28758 scope.go:117] "RemoveContainer" containerID="1e53d8ef27aac9eded58759ed6131f332f53b8102de278b9ab9b1a7074c198ee" Feb 23 14:51:40.814982 master-0 kubenswrapper[28758]: I0223 14:51:40.814962 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:51:40.847924 master-0 kubenswrapper[28758]: I0223 14:51:40.847872 28758 scope.go:117] "RemoveContainer" containerID="2bb1021226f9c42c3894ff0f9c7ff82d81ccb8b5659360d06cb7c123cae8ae92" Feb 23 14:51:40.974088 master-0 kubenswrapper[28758]: I0223 14:51:40.973900 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.973878415 podStartE2EDuration="2.973878415s" podCreationTimestamp="2026-02-23 14:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:40.958264782 +0000 UTC m=+1033.084580724" watchObservedRunningTime="2026-02-23 14:51:40.973878415 +0000 UTC m=+1033.100194347" Feb 23 14:51:40.993524 master-0 kubenswrapper[28758]: I0223 14:51:40.993447 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:41.010642 master-0 kubenswrapper[28758]: I0223 14:51:41.010581 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:41.056077 master-0 kubenswrapper[28758]: I0223 14:51:41.056004 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:41.056803 master-0 kubenswrapper[28758]: E0223 14:51:41.056764 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-log" Feb 23 14:51:41.056803 master-0 kubenswrapper[28758]: I0223 14:51:41.056795 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-log" Feb 23 14:51:41.056940 master-0 kubenswrapper[28758]: E0223 14:51:41.056834 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-api" Feb 23 14:51:41.056940 master-0 kubenswrapper[28758]: I0223 
14:51:41.056843 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-api" Feb 23 14:51:41.057196 master-0 kubenswrapper[28758]: I0223 14:51:41.057170 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-log" Feb 23 14:51:41.057196 master-0 kubenswrapper[28758]: I0223 14:51:41.057194 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" containerName="nova-api-api" Feb 23 14:51:41.058807 master-0 kubenswrapper[28758]: I0223 14:51:41.058774 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:51:41.060829 master-0 kubenswrapper[28758]: I0223 14:51:41.060789 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 14:51:41.087384 master-0 kubenswrapper[28758]: I0223 14:51:41.078004 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:41.241111 master-0 kubenswrapper[28758]: I0223 14:51:41.240603 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8p4d\" (UniqueName: \"kubernetes.io/projected/0004ca55-1f7a-4c38-8934-3c73e1380545-kube-api-access-r8p4d\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.241111 master-0 kubenswrapper[28758]: I0223 14:51:41.240808 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.241111 master-0 kubenswrapper[28758]: I0223 14:51:41.240847 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0004ca55-1f7a-4c38-8934-3c73e1380545-logs\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.241111 master-0 kubenswrapper[28758]: I0223 14:51:41.240936 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-config-data\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.342015 master-0 kubenswrapper[28758]: I0223 14:51:41.341689 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8p4d\" (UniqueName: \"kubernetes.io/projected/0004ca55-1f7a-4c38-8934-3c73e1380545-kube-api-access-r8p4d\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.342015 master-0 kubenswrapper[28758]: I0223 14:51:41.341815 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.342015 master-0 kubenswrapper[28758]: I0223 14:51:41.341846 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0004ca55-1f7a-4c38-8934-3c73e1380545-logs\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.342015 master-0 kubenswrapper[28758]: I0223 14:51:41.341897 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-config-data\") 
pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.343466 master-0 kubenswrapper[28758]: I0223 14:51:41.342375 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0004ca55-1f7a-4c38-8934-3c73e1380545-logs\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.356500 master-0 kubenswrapper[28758]: I0223 14:51:41.351355 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.356500 master-0 kubenswrapper[28758]: I0223 14:51:41.351431 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-config-data\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.365559 master-0 kubenswrapper[28758]: I0223 14:51:41.362783 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8p4d\" (UniqueName: \"kubernetes.io/projected/0004ca55-1f7a-4c38-8934-3c73e1380545-kube-api-access-r8p4d\") pod \"nova-api-0\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " pod="openstack/nova-api-0" Feb 23 14:51:41.428626 master-0 kubenswrapper[28758]: I0223 14:51:41.423933 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:51:41.835154 master-0 kubenswrapper[28758]: I0223 14:51:41.833215 28758 generic.go:334] "Generic (PLEG): container finished" podID="d1957082-18ba-4fec-bc8d-7daf7534c3fb" containerID="4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1" exitCode=0 Feb 23 14:51:41.835154 master-0 kubenswrapper[28758]: I0223 14:51:41.833296 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1957082-18ba-4fec-bc8d-7daf7534c3fb","Type":"ContainerDied","Data":"4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1"} Feb 23 14:51:42.089838 master-0 kubenswrapper[28758]: I0223 14:51:42.089785 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:51:42.106597 master-0 kubenswrapper[28758]: I0223 14:51:42.106062 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="640d93a2-e933-4da5-ba03-5905a4ffecb9" path="/var/lib/kubelet/pods/640d93a2-e933-4da5-ba03-5905a4ffecb9/volumes" Feb 23 14:51:42.166311 master-0 kubenswrapper[28758]: I0223 14:51:42.166248 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:51:42.185712 master-0 kubenswrapper[28758]: W0223 14:51:42.185099 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0004ca55_1f7a_4c38_8934_3c73e1380545.slice/crio-3332ef26649bb7c29de4e7fc162ae117bd9a126c035b995416fdc98fb04d0b69 WatchSource:0}: Error finding container 3332ef26649bb7c29de4e7fc162ae117bd9a126c035b995416fdc98fb04d0b69: Status 404 returned error can't find the container with id 3332ef26649bb7c29de4e7fc162ae117bd9a126c035b995416fdc98fb04d0b69 Feb 23 14:51:42.266582 master-0 kubenswrapper[28758]: I0223 14:51:42.266514 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-config-data\") pod \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " Feb 23 14:51:42.267022 master-0 kubenswrapper[28758]: I0223 14:51:42.266990 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-combined-ca-bundle\") pod \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " Feb 23 14:51:42.267441 master-0 kubenswrapper[28758]: I0223 14:51:42.267129 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb4t2\" (UniqueName: \"kubernetes.io/projected/d1957082-18ba-4fec-bc8d-7daf7534c3fb-kube-api-access-nb4t2\") pod \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\" (UID: \"d1957082-18ba-4fec-bc8d-7daf7534c3fb\") " Feb 23 14:51:42.270389 master-0 kubenswrapper[28758]: I0223 14:51:42.270338 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1957082-18ba-4fec-bc8d-7daf7534c3fb-kube-api-access-nb4t2" (OuterVolumeSpecName: "kube-api-access-nb4t2") pod "d1957082-18ba-4fec-bc8d-7daf7534c3fb" (UID: "d1957082-18ba-4fec-bc8d-7daf7534c3fb"). InnerVolumeSpecName "kube-api-access-nb4t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:51:42.311911 master-0 kubenswrapper[28758]: I0223 14:51:42.311856 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-config-data" (OuterVolumeSpecName: "config-data") pod "d1957082-18ba-4fec-bc8d-7daf7534c3fb" (UID: "d1957082-18ba-4fec-bc8d-7daf7534c3fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:42.320714 master-0 kubenswrapper[28758]: I0223 14:51:42.320650 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1957082-18ba-4fec-bc8d-7daf7534c3fb" (UID: "d1957082-18ba-4fec-bc8d-7daf7534c3fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:42.370738 master-0 kubenswrapper[28758]: I0223 14:51:42.370683 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb4t2\" (UniqueName: \"kubernetes.io/projected/d1957082-18ba-4fec-bc8d-7daf7534c3fb-kube-api-access-nb4t2\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.370738 master-0 kubenswrapper[28758]: I0223 14:51:42.370733 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.370738 master-0 kubenswrapper[28758]: I0223 14:51:42.370748 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1957082-18ba-4fec-bc8d-7daf7534c3fb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.677445 master-0 kubenswrapper[28758]: I0223 14:51:42.677373 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:51:42.787850 master-0 kubenswrapper[28758]: I0223 14:51:42.787608 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-config-data\") pod \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " Feb 23 14:51:42.787850 master-0 kubenswrapper[28758]: I0223 14:51:42.787766 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-logs\") pod \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " Feb 23 14:51:42.788113 master-0 kubenswrapper[28758]: I0223 14:51:42.787901 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krsb5\" (UniqueName: \"kubernetes.io/projected/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-kube-api-access-krsb5\") pod \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " Feb 23 14:51:42.788113 master-0 kubenswrapper[28758]: I0223 14:51:42.787928 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-combined-ca-bundle\") pod \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " Feb 23 14:51:42.788113 master-0 kubenswrapper[28758]: I0223 14:51:42.787982 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-nova-metadata-tls-certs\") pod \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\" (UID: \"71561be7-edc6-4a2c-8145-5e9df5a3b0e2\") " Feb 23 14:51:42.791241 master-0 kubenswrapper[28758]: I0223 14:51:42.791199 28758 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-logs" (OuterVolumeSpecName: "logs") pod "71561be7-edc6-4a2c-8145-5e9df5a3b0e2" (UID: "71561be7-edc6-4a2c-8145-5e9df5a3b0e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:51:42.803895 master-0 kubenswrapper[28758]: I0223 14:51:42.803837 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-kube-api-access-krsb5" (OuterVolumeSpecName: "kube-api-access-krsb5") pod "71561be7-edc6-4a2c-8145-5e9df5a3b0e2" (UID: "71561be7-edc6-4a2c-8145-5e9df5a3b0e2"). InnerVolumeSpecName "kube-api-access-krsb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:51:42.819647 master-0 kubenswrapper[28758]: I0223 14:51:42.819588 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71561be7-edc6-4a2c-8145-5e9df5a3b0e2" (UID: "71561be7-edc6-4a2c-8145-5e9df5a3b0e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:42.822700 master-0 kubenswrapper[28758]: I0223 14:51:42.822656 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-config-data" (OuterVolumeSpecName: "config-data") pod "71561be7-edc6-4a2c-8145-5e9df5a3b0e2" (UID: "71561be7-edc6-4a2c-8145-5e9df5a3b0e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:42.852043 master-0 kubenswrapper[28758]: I0223 14:51:42.851873 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "71561be7-edc6-4a2c-8145-5e9df5a3b0e2" (UID: "71561be7-edc6-4a2c-8145-5e9df5a3b0e2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:42.885823 master-0 kubenswrapper[28758]: I0223 14:51:42.885758 28758 generic.go:334] "Generic (PLEG): container finished" podID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerID="3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89" exitCode=0 Feb 23 14:51:42.886050 master-0 kubenswrapper[28758]: I0223 14:51:42.885845 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71561be7-edc6-4a2c-8145-5e9df5a3b0e2","Type":"ContainerDied","Data":"3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89"} Feb 23 14:51:42.886050 master-0 kubenswrapper[28758]: I0223 14:51:42.885882 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71561be7-edc6-4a2c-8145-5e9df5a3b0e2","Type":"ContainerDied","Data":"d4bb47b9377bad43d1f525b1460327e91d66616d3d1c4185ca1a52c031aa0603"} Feb 23 14:51:42.886050 master-0 kubenswrapper[28758]: I0223 14:51:42.885901 28758 scope.go:117] "RemoveContainer" containerID="3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89" Feb 23 14:51:42.886050 master-0 kubenswrapper[28758]: I0223 14:51:42.886026 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:51:42.891405 master-0 kubenswrapper[28758]: I0223 14:51:42.891237 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.891405 master-0 kubenswrapper[28758]: I0223 14:51:42.891279 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.891405 master-0 kubenswrapper[28758]: I0223 14:51:42.891294 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krsb5\" (UniqueName: \"kubernetes.io/projected/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-kube-api-access-krsb5\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.891405 master-0 kubenswrapper[28758]: I0223 14:51:42.891308 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.891405 master-0 kubenswrapper[28758]: I0223 14:51:42.891320 28758 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71561be7-edc6-4a2c-8145-5e9df5a3b0e2-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:42.899432 master-0 kubenswrapper[28758]: I0223 14:51:42.899357 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0004ca55-1f7a-4c38-8934-3c73e1380545","Type":"ContainerStarted","Data":"1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8"} Feb 23 14:51:42.899551 master-0 kubenswrapper[28758]: I0223 14:51:42.899433 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0004ca55-1f7a-4c38-8934-3c73e1380545","Type":"ContainerStarted","Data":"22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542"} Feb 23 14:51:42.899551 master-0 kubenswrapper[28758]: I0223 14:51:42.899451 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0004ca55-1f7a-4c38-8934-3c73e1380545","Type":"ContainerStarted","Data":"3332ef26649bb7c29de4e7fc162ae117bd9a126c035b995416fdc98fb04d0b69"} Feb 23 14:51:42.917918 master-0 kubenswrapper[28758]: I0223 14:51:42.917859 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d1957082-18ba-4fec-bc8d-7daf7534c3fb","Type":"ContainerDied","Data":"cfad930f6f41fe76e125995eb767681d5421082c63b957332c472ae639b68bbb"} Feb 23 14:51:42.918178 master-0 kubenswrapper[28758]: I0223 14:51:42.917983 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:51:42.948067 master-0 kubenswrapper[28758]: I0223 14:51:42.947133 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.947110666 podStartE2EDuration="1.947110666s" podCreationTimestamp="2026-02-23 14:51:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:42.932709928 +0000 UTC m=+1035.059025860" watchObservedRunningTime="2026-02-23 14:51:42.947110666 +0000 UTC m=+1035.073426598" Feb 23 14:51:42.962252 master-0 kubenswrapper[28758]: I0223 14:51:42.960542 28758 scope.go:117] "RemoveContainer" containerID="dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e" Feb 23 14:51:42.989446 master-0 kubenswrapper[28758]: I0223 14:51:42.989291 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:43.021945 master-0 kubenswrapper[28758]: I0223 14:51:43.019646 28758 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.042389 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: E0223 14:51:43.043087 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-log" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.043106 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-log" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: E0223 14:51:43.043130 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-metadata" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.043138 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-metadata" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: E0223 14:51:43.043164 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1957082-18ba-4fec-bc8d-7daf7534c3fb" containerName="nova-scheduler-scheduler" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.043172 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1957082-18ba-4fec-bc8d-7daf7534c3fb" containerName="nova-scheduler-scheduler" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.043531 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-log" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.043583 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1957082-18ba-4fec-bc8d-7daf7534c3fb" containerName="nova-scheduler-scheduler" Feb 23 
14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.043609 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" containerName="nova-metadata-metadata" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.050742 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.057106 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 14:51:43.057496 master-0 kubenswrapper[28758]: I0223 14:51:43.057380 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 14:51:43.119395 master-0 kubenswrapper[28758]: I0223 14:51:43.119325 28758 scope.go:117] "RemoveContainer" containerID="3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89" Feb 23 14:51:43.123292 master-0 kubenswrapper[28758]: E0223 14:51:43.123144 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89\": container with ID starting with 3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89 not found: ID does not exist" containerID="3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89" Feb 23 14:51:43.132332 master-0 kubenswrapper[28758]: I0223 14:51:43.123218 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89"} err="failed to get container status \"3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89\": rpc error: code = NotFound desc = could not find container \"3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89\": container with ID starting with 
3c4c00d8d239c2a43cd5de2f41ed0e943f773594d425bf7cdf034c48e2710c89 not found: ID does not exist" Feb 23 14:51:43.132332 master-0 kubenswrapper[28758]: I0223 14:51:43.129709 28758 scope.go:117] "RemoveContainer" containerID="dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e" Feb 23 14:51:43.132332 master-0 kubenswrapper[28758]: I0223 14:51:43.130237 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:43.132332 master-0 kubenswrapper[28758]: E0223 14:51:43.130451 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e\": container with ID starting with dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e not found: ID does not exist" containerID="dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e" Feb 23 14:51:43.132332 master-0 kubenswrapper[28758]: I0223 14:51:43.130501 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e"} err="failed to get container status \"dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e\": rpc error: code = NotFound desc = could not find container \"dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e\": container with ID starting with dd80d9274614dc84f57777dd0cd2ef9c282326c3edf9402fc2c572f5bc63ae0e not found: ID does not exist" Feb 23 14:51:43.132332 master-0 kubenswrapper[28758]: I0223 14:51:43.130535 28758 scope.go:117] "RemoveContainer" containerID="4d5a574333d02e976b745d647a22f2a3881098aaaee4a2e4136a369fcdbe14e1" Feb 23 14:51:43.156619 master-0 kubenswrapper[28758]: I0223 14:51:43.156562 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:43.189932 master-0 kubenswrapper[28758]: I0223 14:51:43.189837 28758 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:43.228042 master-0 kubenswrapper[28758]: I0223 14:51:43.227987 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:43.230266 master-0 kubenswrapper[28758]: I0223 14:51:43.230233 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:51:43.230587 master-0 kubenswrapper[28758]: I0223 14:51:43.230534 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvqrn\" (UniqueName: \"kubernetes.io/projected/07be6be7-f047-46d7-a28b-cc077ce2261c-kube-api-access-hvqrn\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.231059 master-0 kubenswrapper[28758]: I0223 14:51:43.231014 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.231666 master-0 kubenswrapper[28758]: I0223 14:51:43.231632 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.233085 master-0 kubenswrapper[28758]: I0223 14:51:43.231701 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07be6be7-f047-46d7-a28b-cc077ce2261c-logs\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " 
pod="openstack/nova-metadata-0" Feb 23 14:51:43.233085 master-0 kubenswrapper[28758]: I0223 14:51:43.232590 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 14:51:43.233962 master-0 kubenswrapper[28758]: I0223 14:51:43.233921 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-config-data\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.249055 master-0 kubenswrapper[28758]: I0223 14:51:43.248843 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:43.337499 master-0 kubenswrapper[28758]: I0223 14:51:43.337279 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-config-data\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.337691 master-0 kubenswrapper[28758]: I0223 14:51:43.337534 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvqrn\" (UniqueName: \"kubernetes.io/projected/07be6be7-f047-46d7-a28b-cc077ce2261c-kube-api-access-hvqrn\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.337691 master-0 kubenswrapper[28758]: I0223 14:51:43.337597 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwdf5\" (UniqueName: \"kubernetes.io/projected/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-kube-api-access-jwdf5\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.337691 master-0 kubenswrapper[28758]: 
I0223 14:51:43.337672 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.337828 master-0 kubenswrapper[28758]: I0223 14:51:43.337693 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-config-data\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.337828 master-0 kubenswrapper[28758]: I0223 14:51:43.337729 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.337828 master-0 kubenswrapper[28758]: I0223 14:51:43.337793 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07be6be7-f047-46d7-a28b-cc077ce2261c-logs\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.337961 master-0 kubenswrapper[28758]: I0223 14:51:43.337907 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.346599 master-0 kubenswrapper[28758]: I0223 14:51:43.338386 28758 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07be6be7-f047-46d7-a28b-cc077ce2261c-logs\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.346599 master-0 kubenswrapper[28758]: I0223 14:51:43.341856 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.346599 master-0 kubenswrapper[28758]: I0223 14:51:43.344153 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-config-data\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.362501 master-0 kubenswrapper[28758]: I0223 14:51:43.357615 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvqrn\" (UniqueName: \"kubernetes.io/projected/07be6be7-f047-46d7-a28b-cc077ce2261c-kube-api-access-hvqrn\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.372924 master-0 kubenswrapper[28758]: I0223 14:51:43.371396 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " pod="openstack/nova-metadata-0" Feb 23 14:51:43.443468 master-0 kubenswrapper[28758]: I0223 14:51:43.443397 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-combined-ca-bundle\") 
pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.443693 master-0 kubenswrapper[28758]: I0223 14:51:43.443620 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwdf5\" (UniqueName: \"kubernetes.io/projected/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-kube-api-access-jwdf5\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.443780 master-0 kubenswrapper[28758]: I0223 14:51:43.443713 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-config-data\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.447540 master-0 kubenswrapper[28758]: I0223 14:51:43.447496 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-config-data\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.447925 master-0 kubenswrapper[28758]: I0223 14:51:43.447826 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:51:43.448198 master-0 kubenswrapper[28758]: I0223 14:51:43.448083 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.480579 master-0 kubenswrapper[28758]: I0223 14:51:43.480502 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwdf5\" (UniqueName: \"kubernetes.io/projected/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-kube-api-access-jwdf5\") pod \"nova-scheduler-0\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " pod="openstack/nova-scheduler-0" Feb 23 14:51:43.557385 master-0 kubenswrapper[28758]: I0223 14:51:43.557327 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:51:43.942614 master-0 kubenswrapper[28758]: I0223 14:51:43.942553 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:51:44.113614 master-0 kubenswrapper[28758]: I0223 14:51:44.113501 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71561be7-edc6-4a2c-8145-5e9df5a3b0e2" path="/var/lib/kubelet/pods/71561be7-edc6-4a2c-8145-5e9df5a3b0e2/volumes" Feb 23 14:51:44.114323 master-0 kubenswrapper[28758]: I0223 14:51:44.114288 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1957082-18ba-4fec-bc8d-7daf7534c3fb" path="/var/lib/kubelet/pods/d1957082-18ba-4fec-bc8d-7daf7534c3fb/volumes" Feb 23 14:51:44.546198 master-0 kubenswrapper[28758]: W0223 14:51:44.546138 28758 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b8e2511_b5e6_4a0a_9baf_d5be2c0a58ec.slice/crio-13e3c12dedbe97c1796ce4352b82f4344a1290bea7f1b8c67275fd3a6999b593 WatchSource:0}: Error finding container 13e3c12dedbe97c1796ce4352b82f4344a1290bea7f1b8c67275fd3a6999b593: Status 404 returned error can't find the container with id 13e3c12dedbe97c1796ce4352b82f4344a1290bea7f1b8c67275fd3a6999b593 Feb 23 14:51:44.551089 master-0 kubenswrapper[28758]: I0223 14:51:44.551030 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:51:44.960513 master-0 kubenswrapper[28758]: I0223 14:51:44.959214 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec","Type":"ContainerStarted","Data":"13e3c12dedbe97c1796ce4352b82f4344a1290bea7f1b8c67275fd3a6999b593"} Feb 23 14:51:44.964307 master-0 kubenswrapper[28758]: I0223 14:51:44.961702 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07be6be7-f047-46d7-a28b-cc077ce2261c","Type":"ContainerStarted","Data":"7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2"} Feb 23 14:51:44.964307 master-0 kubenswrapper[28758]: I0223 14:51:44.961763 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07be6be7-f047-46d7-a28b-cc077ce2261c","Type":"ContainerStarted","Data":"62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132"} Feb 23 14:51:44.964307 master-0 kubenswrapper[28758]: I0223 14:51:44.961774 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07be6be7-f047-46d7-a28b-cc077ce2261c","Type":"ContainerStarted","Data":"089cf05f9de48ff9c8cbbb99013397bdf64f81b9aad91fff57fccff9dbe06bca"} Feb 23 14:51:45.975041 master-0 kubenswrapper[28758]: I0223 14:51:45.974952 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec","Type":"ContainerStarted","Data":"bf8ea5b460d72729ee24b47e584409946a22df206c7b08d7bcca7e525d06c63c"} Feb 23 14:51:46.043072 master-0 kubenswrapper[28758]: I0223 14:51:46.042561 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.042534946 podStartE2EDuration="4.042534946s" podCreationTimestamp="2026-02-23 14:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:46.027241983 +0000 UTC m=+1038.153557915" watchObservedRunningTime="2026-02-23 14:51:46.042534946 +0000 UTC m=+1038.168850888" Feb 23 14:51:46.069939 master-0 kubenswrapper[28758]: I0223 14:51:46.069788 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.069752757 podStartE2EDuration="3.069752757s" podCreationTimestamp="2026-02-23 14:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:46.058328823 +0000 UTC m=+1038.184644765" watchObservedRunningTime="2026-02-23 14:51:46.069752757 +0000 UTC m=+1038.196068689" Feb 23 14:51:46.994325 master-0 kubenswrapper[28758]: I0223 14:51:46.994261 28758 generic.go:334] "Generic (PLEG): container finished" podID="894fbd22-c889-426b-954b-04a9a0e4d905" containerID="c29f86f08145946322f1837c39c2016a439af71f93f855c7d6aa6138194a1930" exitCode=0 Feb 23 14:51:46.994909 master-0 kubenswrapper[28758]: I0223 14:51:46.994514 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerDied","Data":"c29f86f08145946322f1837c39c2016a439af71f93f855c7d6aa6138194a1930"} Feb 23 14:51:48.023572 master-0 kubenswrapper[28758]: I0223 14:51:48.021738 28758 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerStarted","Data":"4e37a25c236e88d73dd1b15bbcaf4021f8e794489269b10d14e5f1b1142b0dfe"} Feb 23 14:51:48.023572 master-0 kubenswrapper[28758]: I0223 14:51:48.021800 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerStarted","Data":"ee8df06edae846387c0fe18a2328902e937b4d75686de8c03f8cc1431c65ba6d"} Feb 23 14:51:48.462608 master-0 kubenswrapper[28758]: I0223 14:51:48.460077 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 14:51:48.462608 master-0 kubenswrapper[28758]: I0223 14:51:48.460162 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 14:51:48.558574 master-0 kubenswrapper[28758]: I0223 14:51:48.558325 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 14:51:49.091870 master-0 kubenswrapper[28758]: I0223 14:51:49.051046 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"894fbd22-c889-426b-954b-04a9a0e4d905","Type":"ContainerStarted","Data":"5f6a86be0338ea1718f90524224f4827043f8414e9c3244db319064ea39a01ef"} Feb 23 14:51:49.091870 master-0 kubenswrapper[28758]: I0223 14:51:49.051290 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Feb 23 14:51:49.182541 master-0 kubenswrapper[28758]: I0223 14:51:49.180812 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=68.44478279 podStartE2EDuration="1m49.18076856s" podCreationTimestamp="2026-02-23 14:50:00 +0000 UTC" firstStartedPulling="2026-02-23 14:50:11.445853756 +0000 UTC m=+943.572169688" 
lastFinishedPulling="2026-02-23 14:50:52.181839526 +0000 UTC m=+984.308155458" observedRunningTime="2026-02-23 14:51:49.164511049 +0000 UTC m=+1041.290827001" watchObservedRunningTime="2026-02-23 14:51:49.18076856 +0000 UTC m=+1041.307084492" Feb 23 14:51:49.297579 master-0 kubenswrapper[28758]: I0223 14:51:49.296306 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 23 14:51:49.637347 master-0 kubenswrapper[28758]: I0223 14:51:49.637031 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0" Feb 23 14:51:50.063874 master-0 kubenswrapper[28758]: I0223 14:51:50.063811 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Feb 23 14:51:51.086083 master-0 kubenswrapper[28758]: I0223 14:51:51.086013 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0" Feb 23 14:51:51.111296 master-0 kubenswrapper[28758]: I0223 14:51:51.111226 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Feb 23 14:51:51.425389 master-0 kubenswrapper[28758]: I0223 14:51:51.425232 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 14:51:51.425389 master-0 kubenswrapper[28758]: I0223 14:51:51.425308 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 14:51:52.114053 master-0 kubenswrapper[28758]: I0223 14:51:52.113992 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0" Feb 23 14:51:52.509782 master-0 kubenswrapper[28758]: I0223 14:51:52.509674 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-api" probeResult="failure" output="Get 
\"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:51:52.509782 master-0 kubenswrapper[28758]: I0223 14:51:52.509725 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.10:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:51:53.449171 master-0 kubenswrapper[28758]: I0223 14:51:53.449075 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 14:51:53.449171 master-0 kubenswrapper[28758]: I0223 14:51:53.449156 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 14:51:53.559354 master-0 kubenswrapper[28758]: I0223 14:51:53.559271 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 14:51:53.596330 master-0 kubenswrapper[28758]: I0223 14:51:53.596233 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 14:51:54.161247 master-0 kubenswrapper[28758]: I0223 14:51:54.161170 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 14:51:54.462025 master-0 kubenswrapper[28758]: I0223 14:51:54.461907 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.11:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:51:54.462782 master-0 kubenswrapper[28758]: I0223 14:51:54.461907 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.11:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:51:54.956015 master-0 kubenswrapper[28758]: I0223 14:51:54.955949 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:54.982468 master-0 kubenswrapper[28758]: I0223 14:51:54.982394 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-combined-ca-bundle\") pod \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " Feb 23 14:51:54.982468 master-0 kubenswrapper[28758]: I0223 14:51:54.982460 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-config-data\") pod \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " Feb 23 14:51:54.982840 master-0 kubenswrapper[28758]: I0223 14:51:54.982614 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qddps\" (UniqueName: \"kubernetes.io/projected/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-kube-api-access-qddps\") pod \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\" (UID: \"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21\") " Feb 23 14:51:54.986676 master-0 kubenswrapper[28758]: I0223 14:51:54.986605 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-kube-api-access-qddps" (OuterVolumeSpecName: "kube-api-access-qddps") pod "7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" (UID: "7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21"). InnerVolumeSpecName "kube-api-access-qddps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:51:55.020404 master-0 kubenswrapper[28758]: I0223 14:51:55.020290 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-config-data" (OuterVolumeSpecName: "config-data") pod "7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" (UID: "7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:55.030324 master-0 kubenswrapper[28758]: I0223 14:51:55.029927 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" (UID: "7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:51:55.085782 master-0 kubenswrapper[28758]: I0223 14:51:55.085675 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qddps\" (UniqueName: \"kubernetes.io/projected/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-kube-api-access-qddps\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:55.085782 master-0 kubenswrapper[28758]: I0223 14:51:55.085759 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:55.085782 master-0 kubenswrapper[28758]: I0223 14:51:55.085771 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:51:55.137622 master-0 kubenswrapper[28758]: I0223 14:51:55.137542 28758 generic.go:334] "Generic (PLEG): container finished" 
podID="7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" containerID="e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda" exitCode=137 Feb 23 14:51:55.137622 master-0 kubenswrapper[28758]: I0223 14:51:55.137560 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 23 14:51:55.138056 master-0 kubenswrapper[28758]: I0223 14:51:55.137581 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21","Type":"ContainerDied","Data":"e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda"} Feb 23 14:51:55.138056 master-0 kubenswrapper[28758]: I0223 14:51:55.137741 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21","Type":"ContainerDied","Data":"f1c8a2e599b5146090f11d5b5cf49421f75ca9c7a6543d507abc1c487876cc52"} Feb 23 14:51:55.138056 master-0 kubenswrapper[28758]: I0223 14:51:55.137763 28758 scope.go:117] "RemoveContainer" containerID="e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda" Feb 23 14:51:55.197109 master-0 kubenswrapper[28758]: I0223 14:51:55.192996 28758 scope.go:117] "RemoveContainer" containerID="e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda" Feb 23 14:51:55.197109 master-0 kubenswrapper[28758]: E0223 14:51:55.193429 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda\": container with ID starting with e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda not found: ID does not exist" containerID="e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda" Feb 23 14:51:55.197109 master-0 kubenswrapper[28758]: I0223 14:51:55.193504 28758 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda"} err="failed to get container status \"e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda\": rpc error: code = NotFound desc = could not find container \"e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda\": container with ID starting with e577b45efe50dcbbda9b7f7ce6fc657b56b311a72a17b4f94d5eac9e89d8fcda not found: ID does not exist"
Feb 23 14:51:55.210112 master-0 kubenswrapper[28758]: I0223 14:51:55.209742 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 14:51:55.232690 master-0 kubenswrapper[28758]: I0223 14:51:55.232585 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 14:51:55.263603 master-0 kubenswrapper[28758]: I0223 14:51:55.262638 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 14:51:55.263839 master-0 kubenswrapper[28758]: E0223 14:51:55.263760 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 14:51:55.263839 master-0 kubenswrapper[28758]: I0223 14:51:55.263781 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 14:51:55.264105 master-0 kubenswrapper[28758]: I0223 14:51:55.264074 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" containerName="nova-cell1-novncproxy-novncproxy"
Feb 23 14:51:55.267617 master-0 kubenswrapper[28758]: I0223 14:51:55.265815 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.280818 master-0 kubenswrapper[28758]: I0223 14:51:55.280683 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 23 14:51:55.281110 master-0 kubenswrapper[28758]: I0223 14:51:55.281074 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 23 14:51:55.281585 master-0 kubenswrapper[28758]: I0223 14:51:55.281489 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 23 14:51:55.282081 master-0 kubenswrapper[28758]: I0223 14:51:55.282038 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 14:51:55.302167 master-0 kubenswrapper[28758]: I0223 14:51:55.302085 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.302447 master-0 kubenswrapper[28758]: I0223 14:51:55.302401 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-486fw\" (UniqueName: \"kubernetes.io/projected/1b8a1637-d3b3-4945-98f6-128eed86cb10-kube-api-access-486fw\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.302715 master-0 kubenswrapper[28758]: I0223 14:51:55.302686 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.302778 master-0 kubenswrapper[28758]: I0223 14:51:55.302738 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.303182 master-0 kubenswrapper[28758]: I0223 14:51:55.303151 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.406735 master-0 kubenswrapper[28758]: I0223 14:51:55.406461 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.406735 master-0 kubenswrapper[28758]: I0223 14:51:55.406590 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-486fw\" (UniqueName: \"kubernetes.io/projected/1b8a1637-d3b3-4945-98f6-128eed86cb10-kube-api-access-486fw\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.406735 master-0 kubenswrapper[28758]: I0223 14:51:55.406666 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.406735 master-0 kubenswrapper[28758]: I0223 14:51:55.406725 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.407241 master-0 kubenswrapper[28758]: I0223 14:51:55.406825 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.417642 master-0 kubenswrapper[28758]: I0223 14:51:55.411219 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.417642 master-0 kubenswrapper[28758]: I0223 14:51:55.412615 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.419735 master-0 kubenswrapper[28758]: I0223 14:51:55.418847 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.421151 master-0 kubenswrapper[28758]: I0223 14:51:55.421118 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a1637-d3b3-4945-98f6-128eed86cb10-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.430098 master-0 kubenswrapper[28758]: I0223 14:51:55.430050 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-486fw\" (UniqueName: \"kubernetes.io/projected/1b8a1637-d3b3-4945-98f6-128eed86cb10-kube-api-access-486fw\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8a1637-d3b3-4945-98f6-128eed86cb10\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:55.605632 master-0 kubenswrapper[28758]: I0223 14:51:55.605513 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:51:56.112460 master-0 kubenswrapper[28758]: I0223 14:51:56.112360 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21" path="/var/lib/kubelet/pods/7c86c1e1-e667-49c5-ae5d-8ef8fa0f4b21/volumes"
Feb 23 14:51:56.124885 master-0 kubenswrapper[28758]: I0223 14:51:56.124779 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 23 14:51:56.155400 master-0 kubenswrapper[28758]: I0223 14:51:56.155337 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b8a1637-d3b3-4945-98f6-128eed86cb10","Type":"ContainerStarted","Data":"3c03dc34bd19364ee6d12c7fdf26c94433f7492a1d101e961d37622976305294"}
Feb 23 14:51:57.173373 master-0 kubenswrapper[28758]: I0223 14:51:57.173283 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b8a1637-d3b3-4945-98f6-128eed86cb10","Type":"ContainerStarted","Data":"22e302bc02746320638e12a3d6addf898a39d8c19bb6e766c37934c27e3fd193"}
Feb 23 14:51:57.246832 master-0 kubenswrapper[28758]: I0223 14:51:57.235455 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.235436926 podStartE2EDuration="2.235436926s" podCreationTimestamp="2026-02-23 14:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:51:57.23417946 +0000 UTC m=+1049.360495412" watchObservedRunningTime="2026-02-23 14:51:57.235436926 +0000 UTC m=+1049.361752878"
Feb 23 14:52:00.606326 master-0 kubenswrapper[28758]: I0223 14:52:00.606129 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:52:01.428600 master-0 kubenswrapper[28758]: I0223 14:52:01.428539 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 14:52:01.428912 master-0 kubenswrapper[28758]: I0223 14:52:01.428865 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 23 14:52:01.429069 master-0 kubenswrapper[28758]: I0223 14:52:01.429032 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 14:52:01.431802 master-0 kubenswrapper[28758]: I0223 14:52:01.431768 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 14:52:02.229324 master-0 kubenswrapper[28758]: I0223 14:52:02.229205 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 23 14:52:02.234400 master-0 kubenswrapper[28758]: I0223 14:52:02.233944 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 23 14:52:02.518106 master-0 kubenswrapper[28758]: I0223 14:52:02.518034 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6746dffb99-qhhmg"]
Feb 23 14:52:02.521889 master-0 kubenswrapper[28758]: I0223 14:52:02.521238 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.564961 master-0 kubenswrapper[28758]: I0223 14:52:02.559905 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6746dffb99-qhhmg"]
Feb 23 14:52:02.642503 master-0 kubenswrapper[28758]: I0223 14:52:02.642092 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-ovsdbserver-sb\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.642503 master-0 kubenswrapper[28758]: I0223 14:52:02.642410 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-ovsdbserver-nb\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.642757 master-0 kubenswrapper[28758]: I0223 14:52:02.642525 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzjxc\" (UniqueName: \"kubernetes.io/projected/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-kube-api-access-zzjxc\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.642757 master-0 kubenswrapper[28758]: I0223 14:52:02.642591 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-dns-swift-storage-0\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.642816 master-0 kubenswrapper[28758]: I0223 14:52:02.642776 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-dns-svc\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.644232 master-0 kubenswrapper[28758]: I0223 14:52:02.642868 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-config\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.745895 master-0 kubenswrapper[28758]: I0223 14:52:02.745637 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-ovsdbserver-sb\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.745895 master-0 kubenswrapper[28758]: I0223 14:52:02.745886 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-ovsdbserver-nb\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.746475 master-0 kubenswrapper[28758]: I0223 14:52:02.745952 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzjxc\" (UniqueName: \"kubernetes.io/projected/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-kube-api-access-zzjxc\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.746475 master-0 kubenswrapper[28758]: I0223 14:52:02.745983 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-dns-swift-storage-0\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.746475 master-0 kubenswrapper[28758]: I0223 14:52:02.746041 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-dns-svc\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.746475 master-0 kubenswrapper[28758]: I0223 14:52:02.746085 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-config\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.747339 master-0 kubenswrapper[28758]: I0223 14:52:02.747296 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-ovsdbserver-nb\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.747339 master-0 kubenswrapper[28758]: I0223 14:52:02.747327 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-ovsdbserver-sb\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.747607 master-0 kubenswrapper[28758]: I0223 14:52:02.747576 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-config\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.747980 master-0 kubenswrapper[28758]: I0223 14:52:02.747941 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-dns-svc\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.748642 master-0 kubenswrapper[28758]: I0223 14:52:02.748593 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-dns-swift-storage-0\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.767838 master-0 kubenswrapper[28758]: I0223 14:52:02.767679 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzjxc\" (UniqueName: \"kubernetes.io/projected/0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5-kube-api-access-zzjxc\") pod \"dnsmasq-dns-6746dffb99-qhhmg\" (UID: \"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5\") " pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:02.895787 master-0 kubenswrapper[28758]: I0223 14:52:02.895692 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:03.392147 master-0 kubenswrapper[28758]: I0223 14:52:03.391885 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6746dffb99-qhhmg"]
Feb 23 14:52:03.463695 master-0 kubenswrapper[28758]: I0223 14:52:03.463590 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 14:52:03.464345 master-0 kubenswrapper[28758]: I0223 14:52:03.464251 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 23 14:52:03.477525 master-0 kubenswrapper[28758]: I0223 14:52:03.476804 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 14:52:04.258577 master-0 kubenswrapper[28758]: I0223 14:52:04.258278 28758 generic.go:334] "Generic (PLEG): container finished" podID="0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5" containerID="364eb25f29a7782fd11c1bcf2fd4b6055c16c024df97543616f5afb7f439ba7c" exitCode=0
Feb 23 14:52:04.258836 master-0 kubenswrapper[28758]: I0223 14:52:04.258563 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg" event={"ID":"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5","Type":"ContainerDied","Data":"364eb25f29a7782fd11c1bcf2fd4b6055c16c024df97543616f5afb7f439ba7c"}
Feb 23 14:52:04.258836 master-0 kubenswrapper[28758]: I0223 14:52:04.258629 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg" event={"ID":"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5","Type":"ContainerStarted","Data":"7bfd937813a0ee4409be4c1afdb527432448117dd9144cde7d2ce4795fba2768"}
Feb 23 14:52:04.265547 master-0 kubenswrapper[28758]: I0223 14:52:04.265291 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 23 14:52:05.271539 master-0 kubenswrapper[28758]: I0223 14:52:05.270333 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg" event={"ID":"0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5","Type":"ContainerStarted","Data":"d2398769afdbfa941a64466b5319e3d7de3058b173ec7eb7dd688f39b94dd195"}
Feb 23 14:52:05.298060 master-0 kubenswrapper[28758]: I0223 14:52:05.297977 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 23 14:52:05.298280 master-0 kubenswrapper[28758]: I0223 14:52:05.298230 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-log" containerID="cri-o://22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542" gracePeriod=30
Feb 23 14:52:05.298412 master-0 kubenswrapper[28758]: I0223 14:52:05.298320 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-api" containerID="cri-o://1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8" gracePeriod=30
Feb 23 14:52:05.299551 master-0 kubenswrapper[28758]: I0223 14:52:05.299480 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg" podStartSLOduration=3.299458327 podStartE2EDuration="3.299458327s" podCreationTimestamp="2026-02-23 14:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:52:05.294880957 +0000 UTC m=+1057.421196889" watchObservedRunningTime="2026-02-23 14:52:05.299458327 +0000 UTC m=+1057.425774259"
Feb 23 14:52:05.606893 master-0 kubenswrapper[28758]: I0223 14:52:05.606726 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:52:05.630858 master-0 kubenswrapper[28758]: I0223 14:52:05.630777 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:52:06.285948 master-0 kubenswrapper[28758]: I0223 14:52:06.285873 28758 generic.go:334] "Generic (PLEG): container finished" podID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerID="22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542" exitCode=143
Feb 23 14:52:06.286454 master-0 kubenswrapper[28758]: I0223 14:52:06.285944 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0004ca55-1f7a-4c38-8934-3c73e1380545","Type":"ContainerDied","Data":"22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542"}
Feb 23 14:52:06.286454 master-0 kubenswrapper[28758]: I0223 14:52:06.286186 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg"
Feb 23 14:52:06.301840 master-0 kubenswrapper[28758]: I0223 14:52:06.301761 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 23 14:52:06.624144 master-0 kubenswrapper[28758]: I0223 14:52:06.623739 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-btgpd"]
Feb 23 14:52:06.625878 master-0 kubenswrapper[28758]: I0223 14:52:06.625836 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.635467 master-0 kubenswrapper[28758]: I0223 14:52:06.635428 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 23 14:52:06.636003 master-0 kubenswrapper[28758]: I0223 14:52:06.635603 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 23 14:52:06.640805 master-0 kubenswrapper[28758]: I0223 14:52:06.637255 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-btgpd"]
Feb 23 14:52:06.651792 master-0 kubenswrapper[28758]: I0223 14:52:06.651760 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-9m76c"]
Feb 23 14:52:06.656232 master-0 kubenswrapper[28758]: I0223 14:52:06.656140 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.668633 master-0 kubenswrapper[28758]: I0223 14:52:06.664715 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-9m76c"]
Feb 23 14:52:06.703029 master-0 kubenswrapper[28758]: I0223 14:52:06.702948 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-combined-ca-bundle\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.703314 master-0 kubenswrapper[28758]: I0223 14:52:06.703157 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-scripts\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.703314 master-0 kubenswrapper[28758]: I0223 14:52:06.703234 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-config-data\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.703946 master-0 kubenswrapper[28758]: I0223 14:52:06.703781 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2d67\" (UniqueName: \"kubernetes.io/projected/64cc097b-5a40-40dc-961d-2951e576f39e-kube-api-access-k2d67\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.704227 master-0 kubenswrapper[28758]: I0223 14:52:06.704188 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-scripts\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.704428 master-0 kubenswrapper[28758]: I0223 14:52:06.704321 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-config-data\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.704524 master-0 kubenswrapper[28758]: I0223 14:52:06.704469 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.704688 master-0 kubenswrapper[28758]: I0223 14:52:06.704651 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdmn\" (UniqueName: \"kubernetes.io/projected/c2ec10e8-80ea-4fc3-93ce-3deec7202918-kube-api-access-npdmn\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.806853 master-0 kubenswrapper[28758]: I0223 14:52:06.806785 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2d67\" (UniqueName: \"kubernetes.io/projected/64cc097b-5a40-40dc-961d-2951e576f39e-kube-api-access-k2d67\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.807400 master-0 kubenswrapper[28758]: I0223 14:52:06.807315 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-scripts\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.807508 master-0 kubenswrapper[28758]: I0223 14:52:06.807399 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-config-data\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.807508 master-0 kubenswrapper[28758]: I0223 14:52:06.807447 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.807620 master-0 kubenswrapper[28758]: I0223 14:52:06.807547 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdmn\" (UniqueName: \"kubernetes.io/projected/c2ec10e8-80ea-4fc3-93ce-3deec7202918-kube-api-access-npdmn\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.807620 master-0 kubenswrapper[28758]: I0223 14:52:06.807593 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-combined-ca-bundle\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.807782 master-0 kubenswrapper[28758]: I0223 14:52:06.807741 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-scripts\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.807859 master-0 kubenswrapper[28758]: I0223 14:52:06.807839 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-config-data\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.812209 master-0 kubenswrapper[28758]: I0223 14:52:06.812166 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.813145 master-0 kubenswrapper[28758]: I0223 14:52:06.813111 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-combined-ca-bundle\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.813916 master-0 kubenswrapper[28758]: I0223 14:52:06.813857 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-config-data\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.816064 master-0 kubenswrapper[28758]: I0223 14:52:06.815388 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-config-data\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.816064 master-0 kubenswrapper[28758]: I0223 14:52:06.815868 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-scripts\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.821389 master-0 kubenswrapper[28758]: I0223 14:52:06.821329 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-scripts\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.838117 master-0 kubenswrapper[28758]: I0223 14:52:06.838057 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2d67\" (UniqueName: \"kubernetes.io/projected/64cc097b-5a40-40dc-961d-2951e576f39e-kube-api-access-k2d67\") pod \"nova-cell1-host-discover-9m76c\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:06.843507 master-0 kubenswrapper[28758]: I0223 14:52:06.843415 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdmn\" (UniqueName: \"kubernetes.io/projected/c2ec10e8-80ea-4fc3-93ce-3deec7202918-kube-api-access-npdmn\") pod \"nova-cell1-cell-mapping-btgpd\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.963222 master-0 kubenswrapper[28758]: I0223 14:52:06.962500 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-btgpd"
Feb 23 14:52:06.997265 master-0 kubenswrapper[28758]: I0223 14:52:06.996905 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-9m76c"
Feb 23 14:52:07.503424 master-0 kubenswrapper[28758]: W0223 14:52:07.503339 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64cc097b_5a40_40dc_961d_2951e576f39e.slice/crio-d5abe8aaa95d2377df948c6372e3d87e2c9ef7a93f1feadd4b1fc0fd6d855b3b WatchSource:0}: Error finding container d5abe8aaa95d2377df948c6372e3d87e2c9ef7a93f1feadd4b1fc0fd6d855b3b: Status 404 returned error can't find the container with id d5abe8aaa95d2377df948c6372e3d87e2c9ef7a93f1feadd4b1fc0fd6d855b3b
Feb 23 14:52:07.511167 master-0 kubenswrapper[28758]: I0223 14:52:07.511106 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-9m76c"]
Feb 23 14:52:07.618510 master-0 kubenswrapper[28758]: I0223 14:52:07.616272 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-btgpd"]
Feb 23 14:52:07.618510 master-0 kubenswrapper[28758]: W0223 14:52:07.618052 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ec10e8_80ea_4fc3_93ce_3deec7202918.slice/crio-e35756cc6760d85693d50393600d4cf2c72d8d9f20a9ba240bfc1dcec14526f8 WatchSource:0}: Error finding container e35756cc6760d85693d50393600d4cf2c72d8d9f20a9ba240bfc1dcec14526f8: Status 404 returned error can't find the container with id e35756cc6760d85693d50393600d4cf2c72d8d9f20a9ba240bfc1dcec14526f8
Feb 23 14:52:08.326010 master-0 kubenswrapper[28758]: I0223 14:52:08.325921 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-9m76c" event={"ID":"64cc097b-5a40-40dc-961d-2951e576f39e","Type":"ContainerStarted","Data":"7721aae09aafaa7e916c995bc3522bc0e1eb74299fe988fc4bfc2950d24c1c5a"}
Feb 23 14:52:08.326361 master-0 kubenswrapper[28758]: I0223 14:52:08.326018 28758 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openstack/nova-cell1-host-discover-9m76c" event={"ID":"64cc097b-5a40-40dc-961d-2951e576f39e","Type":"ContainerStarted","Data":"d5abe8aaa95d2377df948c6372e3d87e2c9ef7a93f1feadd4b1fc0fd6d855b3b"} Feb 23 14:52:08.328300 master-0 kubenswrapper[28758]: I0223 14:52:08.328252 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-btgpd" event={"ID":"c2ec10e8-80ea-4fc3-93ce-3deec7202918","Type":"ContainerStarted","Data":"53bfde90eeec423d49fc398e4df02d1b4e0852a76e87fe0b10c460c7b653dfec"} Feb 23 14:52:08.328300 master-0 kubenswrapper[28758]: I0223 14:52:08.328298 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-btgpd" event={"ID":"c2ec10e8-80ea-4fc3-93ce-3deec7202918","Type":"ContainerStarted","Data":"e35756cc6760d85693d50393600d4cf2c72d8d9f20a9ba240bfc1dcec14526f8"} Feb 23 14:52:08.348109 master-0 kubenswrapper[28758]: I0223 14:52:08.348007 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-9m76c" podStartSLOduration=2.34798528 podStartE2EDuration="2.34798528s" podCreationTimestamp="2026-02-23 14:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:52:08.347689972 +0000 UTC m=+1060.474005914" watchObservedRunningTime="2026-02-23 14:52:08.34798528 +0000 UTC m=+1060.474301222" Feb 23 14:52:08.378309 master-0 kubenswrapper[28758]: I0223 14:52:08.378211 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-btgpd" podStartSLOduration=2.378188616 podStartE2EDuration="2.378188616s" podCreationTimestamp="2026-02-23 14:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:52:08.367717399 +0000 UTC m=+1060.494033331" watchObservedRunningTime="2026-02-23 
14:52:08.378188616 +0000 UTC m=+1060.504504548" Feb 23 14:52:09.013595 master-0 kubenswrapper[28758]: I0223 14:52:09.013415 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:09.080823 master-0 kubenswrapper[28758]: I0223 14:52:09.069209 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-combined-ca-bundle\") pod \"0004ca55-1f7a-4c38-8934-3c73e1380545\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " Feb 23 14:52:09.080823 master-0 kubenswrapper[28758]: I0223 14:52:09.069658 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-config-data\") pod \"0004ca55-1f7a-4c38-8934-3c73e1380545\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " Feb 23 14:52:09.080823 master-0 kubenswrapper[28758]: I0223 14:52:09.069745 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8p4d\" (UniqueName: \"kubernetes.io/projected/0004ca55-1f7a-4c38-8934-3c73e1380545-kube-api-access-r8p4d\") pod \"0004ca55-1f7a-4c38-8934-3c73e1380545\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " Feb 23 14:52:09.080823 master-0 kubenswrapper[28758]: I0223 14:52:09.069854 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0004ca55-1f7a-4c38-8934-3c73e1380545-logs\") pod \"0004ca55-1f7a-4c38-8934-3c73e1380545\" (UID: \"0004ca55-1f7a-4c38-8934-3c73e1380545\") " Feb 23 14:52:09.080823 master-0 kubenswrapper[28758]: I0223 14:52:09.071333 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0004ca55-1f7a-4c38-8934-3c73e1380545-logs" (OuterVolumeSpecName: "logs") pod "0004ca55-1f7a-4c38-8934-3c73e1380545" (UID: 
"0004ca55-1f7a-4c38-8934-3c73e1380545"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:52:09.083807 master-0 kubenswrapper[28758]: I0223 14:52:09.083726 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0004ca55-1f7a-4c38-8934-3c73e1380545-kube-api-access-r8p4d" (OuterVolumeSpecName: "kube-api-access-r8p4d") pod "0004ca55-1f7a-4c38-8934-3c73e1380545" (UID: "0004ca55-1f7a-4c38-8934-3c73e1380545"). InnerVolumeSpecName "kube-api-access-r8p4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:52:09.109009 master-0 kubenswrapper[28758]: I0223 14:52:09.108930 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-config-data" (OuterVolumeSpecName: "config-data") pod "0004ca55-1f7a-4c38-8934-3c73e1380545" (UID: "0004ca55-1f7a-4c38-8934-3c73e1380545"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:09.122325 master-0 kubenswrapper[28758]: I0223 14:52:09.122183 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0004ca55-1f7a-4c38-8934-3c73e1380545" (UID: "0004ca55-1f7a-4c38-8934-3c73e1380545"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:09.172792 master-0 kubenswrapper[28758]: I0223 14:52:09.172253 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:09.172792 master-0 kubenswrapper[28758]: I0223 14:52:09.172313 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8p4d\" (UniqueName: \"kubernetes.io/projected/0004ca55-1f7a-4c38-8934-3c73e1380545-kube-api-access-r8p4d\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:09.172792 master-0 kubenswrapper[28758]: I0223 14:52:09.172330 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0004ca55-1f7a-4c38-8934-3c73e1380545-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:09.172792 master-0 kubenswrapper[28758]: I0223 14:52:09.172344 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0004ca55-1f7a-4c38-8934-3c73e1380545-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:09.343955 master-0 kubenswrapper[28758]: I0223 14:52:09.343873 28758 generic.go:334] "Generic (PLEG): container finished" podID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerID="1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8" exitCode=0 Feb 23 14:52:09.345252 master-0 kubenswrapper[28758]: I0223 14:52:09.345201 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:09.354188 master-0 kubenswrapper[28758]: I0223 14:52:09.354124 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0004ca55-1f7a-4c38-8934-3c73e1380545","Type":"ContainerDied","Data":"1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8"} Feb 23 14:52:09.354188 master-0 kubenswrapper[28758]: I0223 14:52:09.354183 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0004ca55-1f7a-4c38-8934-3c73e1380545","Type":"ContainerDied","Data":"3332ef26649bb7c29de4e7fc162ae117bd9a126c035b995416fdc98fb04d0b69"} Feb 23 14:52:09.354382 master-0 kubenswrapper[28758]: I0223 14:52:09.354205 28758 scope.go:117] "RemoveContainer" containerID="1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8" Feb 23 14:52:09.419624 master-0 kubenswrapper[28758]: I0223 14:52:09.409279 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:09.421868 master-0 kubenswrapper[28758]: I0223 14:52:09.420545 28758 scope.go:117] "RemoveContainer" containerID="22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542" Feb 23 14:52:09.425759 master-0 kubenswrapper[28758]: I0223 14:52:09.425374 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:09.441611 master-0 kubenswrapper[28758]: I0223 14:52:09.441545 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:09.445630 master-0 kubenswrapper[28758]: E0223 14:52:09.442196 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-log" Feb 23 14:52:09.445630 master-0 kubenswrapper[28758]: I0223 14:52:09.442224 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-log" Feb 23 14:52:09.445630 master-0 
kubenswrapper[28758]: E0223 14:52:09.442301 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-api" Feb 23 14:52:09.445630 master-0 kubenswrapper[28758]: I0223 14:52:09.442315 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-api" Feb 23 14:52:09.445630 master-0 kubenswrapper[28758]: I0223 14:52:09.442729 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-log" Feb 23 14:52:09.445630 master-0 kubenswrapper[28758]: I0223 14:52:09.442755 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" containerName="nova-api-api" Feb 23 14:52:09.445630 master-0 kubenswrapper[28758]: I0223 14:52:09.444755 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:09.447829 master-0 kubenswrapper[28758]: I0223 14:52:09.447790 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 23 14:52:09.447901 master-0 kubenswrapper[28758]: I0223 14:52:09.447790 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 14:52:09.447938 master-0 kubenswrapper[28758]: I0223 14:52:09.447792 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 23 14:52:09.457716 master-0 kubenswrapper[28758]: I0223 14:52:09.457608 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:09.595511 master-0 kubenswrapper[28758]: I0223 14:52:09.594866 28758 scope.go:117] "RemoveContainer" containerID="1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8" Feb 23 14:52:09.601512 master-0 kubenswrapper[28758]: I0223 14:52:09.598762 28758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.601512 master-0 kubenswrapper[28758]: I0223 14:52:09.598881 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.601512 master-0 kubenswrapper[28758]: I0223 14:52:09.598959 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktnq9\" (UniqueName: \"kubernetes.io/projected/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-kube-api-access-ktnq9\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.601512 master-0 kubenswrapper[28758]: I0223 14:52:09.599011 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-logs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.601512 master-0 kubenswrapper[28758]: I0223 14:52:09.599081 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-config-data\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.601512 master-0 kubenswrapper[28758]: I0223 14:52:09.599108 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.614511 master-0 kubenswrapper[28758]: E0223 14:52:09.606132 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8\": container with ID starting with 1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8 not found: ID does not exist" containerID="1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8" Feb 23 14:52:09.614511 master-0 kubenswrapper[28758]: I0223 14:52:09.606234 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8"} err="failed to get container status \"1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8\": rpc error: code = NotFound desc = could not find container \"1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8\": container with ID starting with 1a9ae0f88cc41b6554a0718035340164712c6ae259734900671d9be2477509d8 not found: ID does not exist" Feb 23 14:52:09.614511 master-0 kubenswrapper[28758]: I0223 14:52:09.606266 28758 scope.go:117] "RemoveContainer" containerID="22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542" Feb 23 14:52:09.614511 master-0 kubenswrapper[28758]: E0223 14:52:09.606675 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542\": container with ID starting with 22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542 not found: ID does not exist" containerID="22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542" Feb 23 14:52:09.614511 master-0 
kubenswrapper[28758]: I0223 14:52:09.606770 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542"} err="failed to get container status \"22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542\": rpc error: code = NotFound desc = could not find container \"22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542\": container with ID starting with 22919aae0c50a61f21e9874f7f2e7cc92add14c56d6aba57b68eb9bf2b456542 not found: ID does not exist" Feb 23 14:52:09.709509 master-0 kubenswrapper[28758]: I0223 14:52:09.706355 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.709509 master-0 kubenswrapper[28758]: I0223 14:52:09.706635 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.709509 master-0 kubenswrapper[28758]: I0223 14:52:09.706668 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.709509 master-0 kubenswrapper[28758]: I0223 14:52:09.706729 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktnq9\" (UniqueName: \"kubernetes.io/projected/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-kube-api-access-ktnq9\") pod \"nova-api-0\" (UID: 
\"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.709509 master-0 kubenswrapper[28758]: I0223 14:52:09.706760 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-logs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.709509 master-0 kubenswrapper[28758]: I0223 14:52:09.706805 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-config-data\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.720509 master-0 kubenswrapper[28758]: I0223 14:52:09.715691 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-public-tls-certs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.720509 master-0 kubenswrapper[28758]: I0223 14:52:09.716214 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-logs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.726508 master-0 kubenswrapper[28758]: I0223 14:52:09.721033 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.739927 master-0 kubenswrapper[28758]: I0223 14:52:09.739863 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.740560 master-0 kubenswrapper[28758]: I0223 14:52:09.740510 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-config-data\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.746243 master-0 kubenswrapper[28758]: I0223 14:52:09.744626 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktnq9\" (UniqueName: \"kubernetes.io/projected/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-kube-api-access-ktnq9\") pod \"nova-api-0\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " pod="openstack/nova-api-0" Feb 23 14:52:09.818367 master-0 kubenswrapper[28758]: I0223 14:52:09.818205 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:10.108650 master-0 kubenswrapper[28758]: I0223 14:52:10.108432 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0004ca55-1f7a-4c38-8934-3c73e1380545" path="/var/lib/kubelet/pods/0004ca55-1f7a-4c38-8934-3c73e1380545/volumes" Feb 23 14:52:10.309148 master-0 kubenswrapper[28758]: I0223 14:52:10.308522 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:10.363466 master-0 kubenswrapper[28758]: I0223 14:52:10.363387 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde","Type":"ContainerStarted","Data":"4e416c7e63802f310e573ca4985aac973593ad3e5b46bf813971a8c11f49bbef"} Feb 23 14:52:11.376720 master-0 kubenswrapper[28758]: I0223 14:52:11.376573 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde","Type":"ContainerStarted","Data":"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07"} Feb 23 14:52:11.376720 master-0 kubenswrapper[28758]: I0223 14:52:11.376652 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde","Type":"ContainerStarted","Data":"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2"} Feb 23 14:52:11.381943 master-0 kubenswrapper[28758]: I0223 14:52:11.381898 28758 generic.go:334] "Generic (PLEG): container finished" podID="64cc097b-5a40-40dc-961d-2951e576f39e" containerID="7721aae09aafaa7e916c995bc3522bc0e1eb74299fe988fc4bfc2950d24c1c5a" exitCode=0 Feb 23 14:52:11.381943 master-0 kubenswrapper[28758]: I0223 14:52:11.381938 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-9m76c" 
event={"ID":"64cc097b-5a40-40dc-961d-2951e576f39e","Type":"ContainerDied","Data":"7721aae09aafaa7e916c995bc3522bc0e1eb74299fe988fc4bfc2950d24c1c5a"} Feb 23 14:52:11.429630 master-0 kubenswrapper[28758]: I0223 14:52:11.425896 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.425874045 podStartE2EDuration="2.425874045s" podCreationTimestamp="2026-02-23 14:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:52:11.399846878 +0000 UTC m=+1063.526162810" watchObservedRunningTime="2026-02-23 14:52:11.425874045 +0000 UTC m=+1063.552189977" Feb 23 14:52:12.852272 master-0 kubenswrapper[28758]: I0223 14:52:12.852184 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-9m76c" Feb 23 14:52:12.900731 master-0 kubenswrapper[28758]: I0223 14:52:12.896968 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6746dffb99-qhhmg" Feb 23 14:52:13.028206 master-0 kubenswrapper[28758]: I0223 14:52:13.028096 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-config-data\") pod \"64cc097b-5a40-40dc-961d-2951e576f39e\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " Feb 23 14:52:13.028516 master-0 kubenswrapper[28758]: I0223 14:52:13.028321 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-combined-ca-bundle\") pod \"64cc097b-5a40-40dc-961d-2951e576f39e\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " Feb 23 14:52:13.028516 master-0 kubenswrapper[28758]: I0223 14:52:13.028437 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-k2d67\" (UniqueName: \"kubernetes.io/projected/64cc097b-5a40-40dc-961d-2951e576f39e-kube-api-access-k2d67\") pod \"64cc097b-5a40-40dc-961d-2951e576f39e\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " Feb 23 14:52:13.032519 master-0 kubenswrapper[28758]: I0223 14:52:13.028662 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-scripts\") pod \"64cc097b-5a40-40dc-961d-2951e576f39e\" (UID: \"64cc097b-5a40-40dc-961d-2951e576f39e\") " Feb 23 14:52:13.052880 master-0 kubenswrapper[28758]: I0223 14:52:13.052793 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-scripts" (OuterVolumeSpecName: "scripts") pod "64cc097b-5a40-40dc-961d-2951e576f39e" (UID: "64cc097b-5a40-40dc-961d-2951e576f39e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:13.060684 master-0 kubenswrapper[28758]: I0223 14:52:13.060610 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64cc097b-5a40-40dc-961d-2951e576f39e-kube-api-access-k2d67" (OuterVolumeSpecName: "kube-api-access-k2d67") pod "64cc097b-5a40-40dc-961d-2951e576f39e" (UID: "64cc097b-5a40-40dc-961d-2951e576f39e"). InnerVolumeSpecName "kube-api-access-k2d67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:52:13.078932 master-0 kubenswrapper[28758]: I0223 14:52:13.078863 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdb596dbf-l8mhf"] Feb 23 14:52:13.079180 master-0 kubenswrapper[28758]: I0223 14:52:13.079126 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" podUID="79275321-c948-4636-bcdf-2c0ab4f02076" containerName="dnsmasq-dns" containerID="cri-o://95833dc474850c32d6b3fae9ed6bd7114059237225d21845650659f7fdc6d96f" gracePeriod=10 Feb 23 14:52:13.093806 master-0 kubenswrapper[28758]: I0223 14:52:13.093655 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64cc097b-5a40-40dc-961d-2951e576f39e" (UID: "64cc097b-5a40-40dc-961d-2951e576f39e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:13.097556 master-0 kubenswrapper[28758]: I0223 14:52:13.096668 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-config-data" (OuterVolumeSpecName: "config-data") pod "64cc097b-5a40-40dc-961d-2951e576f39e" (UID: "64cc097b-5a40-40dc-961d-2951e576f39e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:13.136335 master-0 kubenswrapper[28758]: I0223 14:52:13.136271 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.136335 master-0 kubenswrapper[28758]: I0223 14:52:13.136328 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2d67\" (UniqueName: \"kubernetes.io/projected/64cc097b-5a40-40dc-961d-2951e576f39e-kube-api-access-k2d67\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.136487 master-0 kubenswrapper[28758]: I0223 14:52:13.136342 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.136487 master-0 kubenswrapper[28758]: I0223 14:52:13.136352 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64cc097b-5a40-40dc-961d-2951e576f39e-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.415620 master-0 kubenswrapper[28758]: I0223 14:52:13.411577 28758 generic.go:334] "Generic (PLEG): container finished" podID="c2ec10e8-80ea-4fc3-93ce-3deec7202918" containerID="53bfde90eeec423d49fc398e4df02d1b4e0852a76e87fe0b10c460c7b653dfec" exitCode=0 Feb 23 14:52:13.415620 master-0 kubenswrapper[28758]: I0223 14:52:13.411656 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-btgpd" event={"ID":"c2ec10e8-80ea-4fc3-93ce-3deec7202918","Type":"ContainerDied","Data":"53bfde90eeec423d49fc398e4df02d1b4e0852a76e87fe0b10c460c7b653dfec"} Feb 23 14:52:13.417093 master-0 kubenswrapper[28758]: I0223 14:52:13.416643 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-9m76c" 
event={"ID":"64cc097b-5a40-40dc-961d-2951e576f39e","Type":"ContainerDied","Data":"d5abe8aaa95d2377df948c6372e3d87e2c9ef7a93f1feadd4b1fc0fd6d855b3b"} Feb 23 14:52:13.417093 master-0 kubenswrapper[28758]: I0223 14:52:13.416719 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5abe8aaa95d2377df948c6372e3d87e2c9ef7a93f1feadd4b1fc0fd6d855b3b" Feb 23 14:52:13.417093 master-0 kubenswrapper[28758]: I0223 14:52:13.416716 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-9m76c" Feb 23 14:52:13.421110 master-0 kubenswrapper[28758]: I0223 14:52:13.421014 28758 generic.go:334] "Generic (PLEG): container finished" podID="79275321-c948-4636-bcdf-2c0ab4f02076" containerID="95833dc474850c32d6b3fae9ed6bd7114059237225d21845650659f7fdc6d96f" exitCode=0 Feb 23 14:52:13.421254 master-0 kubenswrapper[28758]: I0223 14:52:13.421132 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" event={"ID":"79275321-c948-4636-bcdf-2c0ab4f02076","Type":"ContainerDied","Data":"95833dc474850c32d6b3fae9ed6bd7114059237225d21845650659f7fdc6d96f"} Feb 23 14:52:13.538460 master-0 kubenswrapper[28758]: I0223 14:52:13.536764 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:52:13.651622 master-0 kubenswrapper[28758]: I0223 14:52:13.651538 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltl7v\" (UniqueName: \"kubernetes.io/projected/79275321-c948-4636-bcdf-2c0ab4f02076-kube-api-access-ltl7v\") pod \"79275321-c948-4636-bcdf-2c0ab4f02076\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " Feb 23 14:52:13.651622 master-0 kubenswrapper[28758]: I0223 14:52:13.651615 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-swift-storage-0\") pod \"79275321-c948-4636-bcdf-2c0ab4f02076\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " Feb 23 14:52:13.651878 master-0 kubenswrapper[28758]: I0223 14:52:13.651682 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-svc\") pod \"79275321-c948-4636-bcdf-2c0ab4f02076\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " Feb 23 14:52:13.651878 master-0 kubenswrapper[28758]: I0223 14:52:13.651823 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-config\") pod \"79275321-c948-4636-bcdf-2c0ab4f02076\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " Feb 23 14:52:13.651878 master-0 kubenswrapper[28758]: I0223 14:52:13.651854 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-nb\") pod \"79275321-c948-4636-bcdf-2c0ab4f02076\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " Feb 23 14:52:13.651992 master-0 kubenswrapper[28758]: I0223 14:52:13.651916 28758 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-sb\") pod \"79275321-c948-4636-bcdf-2c0ab4f02076\" (UID: \"79275321-c948-4636-bcdf-2c0ab4f02076\") " Feb 23 14:52:13.657023 master-0 kubenswrapper[28758]: I0223 14:52:13.656972 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79275321-c948-4636-bcdf-2c0ab4f02076-kube-api-access-ltl7v" (OuterVolumeSpecName: "kube-api-access-ltl7v") pod "79275321-c948-4636-bcdf-2c0ab4f02076" (UID: "79275321-c948-4636-bcdf-2c0ab4f02076"). InnerVolumeSpecName "kube-api-access-ltl7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:52:13.722917 master-0 kubenswrapper[28758]: I0223 14:52:13.722848 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79275321-c948-4636-bcdf-2c0ab4f02076" (UID: "79275321-c948-4636-bcdf-2c0ab4f02076"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:52:13.725706 master-0 kubenswrapper[28758]: I0223 14:52:13.725588 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-config" (OuterVolumeSpecName: "config") pod "79275321-c948-4636-bcdf-2c0ab4f02076" (UID: "79275321-c948-4636-bcdf-2c0ab4f02076"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:52:13.730318 master-0 kubenswrapper[28758]: I0223 14:52:13.730275 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79275321-c948-4636-bcdf-2c0ab4f02076" (UID: "79275321-c948-4636-bcdf-2c0ab4f02076"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:52:13.734637 master-0 kubenswrapper[28758]: I0223 14:52:13.734557 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "79275321-c948-4636-bcdf-2c0ab4f02076" (UID: "79275321-c948-4636-bcdf-2c0ab4f02076"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:52:13.750986 master-0 kubenswrapper[28758]: I0223 14:52:13.749773 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79275321-c948-4636-bcdf-2c0ab4f02076" (UID: "79275321-c948-4636-bcdf-2c0ab4f02076"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:52:13.759131 master-0 kubenswrapper[28758]: I0223 14:52:13.758864 28758 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.759131 master-0 kubenswrapper[28758]: I0223 14:52:13.758925 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.759131 master-0 kubenswrapper[28758]: I0223 14:52:13.758941 28758 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.759131 master-0 kubenswrapper[28758]: I0223 14:52:13.758958 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltl7v\" (UniqueName: \"kubernetes.io/projected/79275321-c948-4636-bcdf-2c0ab4f02076-kube-api-access-ltl7v\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.759131 master-0 kubenswrapper[28758]: I0223 14:52:13.758968 28758 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:13.759131 master-0 kubenswrapper[28758]: I0223 14:52:13.758982 28758 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79275321-c948-4636-bcdf-2c0ab4f02076-dns-svc\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:14.435161 master-0 kubenswrapper[28758]: I0223 14:52:14.435094 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" 
event={"ID":"79275321-c948-4636-bcdf-2c0ab4f02076","Type":"ContainerDied","Data":"a11c29c95bb00a031fbb9143b7d7392fc1b1c8c04248974cbe27815fdfe13ff4"} Feb 23 14:52:14.435161 master-0 kubenswrapper[28758]: I0223 14:52:14.435127 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bdb596dbf-l8mhf" Feb 23 14:52:14.435161 master-0 kubenswrapper[28758]: I0223 14:52:14.435165 28758 scope.go:117] "RemoveContainer" containerID="95833dc474850c32d6b3fae9ed6bd7114059237225d21845650659f7fdc6d96f" Feb 23 14:52:14.468567 master-0 kubenswrapper[28758]: I0223 14:52:14.468525 28758 scope.go:117] "RemoveContainer" containerID="40de82de7656f911efb90d8bed9335961f3d740b5bdd5f06264ab043010de138" Feb 23 14:52:14.472510 master-0 kubenswrapper[28758]: I0223 14:52:14.470447 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bdb596dbf-l8mhf"] Feb 23 14:52:14.481749 master-0 kubenswrapper[28758]: I0223 14:52:14.481682 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bdb596dbf-l8mhf"] Feb 23 14:52:14.902857 master-0 kubenswrapper[28758]: I0223 14:52:14.902769 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-btgpd" Feb 23 14:52:14.987860 master-0 kubenswrapper[28758]: I0223 14:52:14.987769 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-scripts\") pod \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " Feb 23 14:52:14.988077 master-0 kubenswrapper[28758]: I0223 14:52:14.987891 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-combined-ca-bundle\") pod \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " Feb 23 14:52:14.988077 master-0 kubenswrapper[28758]: I0223 14:52:14.987974 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-config-data\") pod \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " Feb 23 14:52:14.988077 master-0 kubenswrapper[28758]: I0223 14:52:14.987998 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npdmn\" (UniqueName: \"kubernetes.io/projected/c2ec10e8-80ea-4fc3-93ce-3deec7202918-kube-api-access-npdmn\") pod \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\" (UID: \"c2ec10e8-80ea-4fc3-93ce-3deec7202918\") " Feb 23 14:52:14.990917 master-0 kubenswrapper[28758]: I0223 14:52:14.990837 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-scripts" (OuterVolumeSpecName: "scripts") pod "c2ec10e8-80ea-4fc3-93ce-3deec7202918" (UID: "c2ec10e8-80ea-4fc3-93ce-3deec7202918"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:14.991888 master-0 kubenswrapper[28758]: I0223 14:52:14.991794 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ec10e8-80ea-4fc3-93ce-3deec7202918-kube-api-access-npdmn" (OuterVolumeSpecName: "kube-api-access-npdmn") pod "c2ec10e8-80ea-4fc3-93ce-3deec7202918" (UID: "c2ec10e8-80ea-4fc3-93ce-3deec7202918"). InnerVolumeSpecName "kube-api-access-npdmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:52:15.023707 master-0 kubenswrapper[28758]: I0223 14:52:15.023627 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-config-data" (OuterVolumeSpecName: "config-data") pod "c2ec10e8-80ea-4fc3-93ce-3deec7202918" (UID: "c2ec10e8-80ea-4fc3-93ce-3deec7202918"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:15.023707 master-0 kubenswrapper[28758]: I0223 14:52:15.023666 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2ec10e8-80ea-4fc3-93ce-3deec7202918" (UID: "c2ec10e8-80ea-4fc3-93ce-3deec7202918"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:15.089865 master-0 kubenswrapper[28758]: I0223 14:52:15.089789 28758 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-scripts\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:15.089865 master-0 kubenswrapper[28758]: I0223 14:52:15.089832 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:15.089865 master-0 kubenswrapper[28758]: I0223 14:52:15.089848 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ec10e8-80ea-4fc3-93ce-3deec7202918-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:15.089865 master-0 kubenswrapper[28758]: I0223 14:52:15.089857 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npdmn\" (UniqueName: \"kubernetes.io/projected/c2ec10e8-80ea-4fc3-93ce-3deec7202918-kube-api-access-npdmn\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:15.449824 master-0 kubenswrapper[28758]: I0223 14:52:15.449420 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-btgpd" event={"ID":"c2ec10e8-80ea-4fc3-93ce-3deec7202918","Type":"ContainerDied","Data":"e35756cc6760d85693d50393600d4cf2c72d8d9f20a9ba240bfc1dcec14526f8"} Feb 23 14:52:15.449824 master-0 kubenswrapper[28758]: I0223 14:52:15.449504 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-btgpd" Feb 23 14:52:15.449824 master-0 kubenswrapper[28758]: I0223 14:52:15.449527 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e35756cc6760d85693d50393600d4cf2c72d8d9f20a9ba240bfc1dcec14526f8" Feb 23 14:52:15.650342 master-0 kubenswrapper[28758]: I0223 14:52:15.650263 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:15.650769 master-0 kubenswrapper[28758]: I0223 14:52:15.650556 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-log" containerID="cri-o://7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2" gracePeriod=30 Feb 23 14:52:15.650769 master-0 kubenswrapper[28758]: I0223 14:52:15.650632 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-api" containerID="cri-o://03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07" gracePeriod=30 Feb 23 14:52:15.666425 master-0 kubenswrapper[28758]: I0223 14:52:15.666359 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:52:15.666670 master-0 kubenswrapper[28758]: I0223 14:52:15.666622 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" containerName="nova-scheduler-scheduler" containerID="cri-o://bf8ea5b460d72729ee24b47e584409946a22df206c7b08d7bcca7e525d06c63c" gracePeriod=30 Feb 23 14:52:15.744683 master-0 kubenswrapper[28758]: I0223 14:52:15.744538 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:52:15.744890 master-0 kubenswrapper[28758]: I0223 14:52:15.744788 28758 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-log" containerID="cri-o://62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132" gracePeriod=30 Feb 23 14:52:15.745172 master-0 kubenswrapper[28758]: I0223 14:52:15.745120 28758 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-metadata" containerID="cri-o://7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2" gracePeriod=30 Feb 23 14:52:16.113407 master-0 kubenswrapper[28758]: I0223 14:52:16.112416 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79275321-c948-4636-bcdf-2c0ab4f02076" path="/var/lib/kubelet/pods/79275321-c948-4636-bcdf-2c0ab4f02076/volumes" Feb 23 14:52:16.365523 master-0 kubenswrapper[28758]: I0223 14:52:16.365369 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:16.462001 master-0 kubenswrapper[28758]: I0223 14:52:16.461919 28758 generic.go:334] "Generic (PLEG): container finished" podID="8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" containerID="bf8ea5b460d72729ee24b47e584409946a22df206c7b08d7bcca7e525d06c63c" exitCode=0 Feb 23 14:52:16.462001 master-0 kubenswrapper[28758]: I0223 14:52:16.462003 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec","Type":"ContainerDied","Data":"bf8ea5b460d72729ee24b47e584409946a22df206c7b08d7bcca7e525d06c63c"} Feb 23 14:52:16.464964 master-0 kubenswrapper[28758]: I0223 14:52:16.464919 28758 generic.go:334] "Generic (PLEG): container finished" podID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerID="03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07" exitCode=0 Feb 23 14:52:16.464964 master-0 kubenswrapper[28758]: I0223 14:52:16.464952 28758 generic.go:334] "Generic (PLEG): container finished" podID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerID="7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2" exitCode=143 Feb 23 14:52:16.465084 master-0 kubenswrapper[28758]: I0223 14:52:16.464992 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde","Type":"ContainerDied","Data":"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07"} Feb 23 14:52:16.465084 master-0 kubenswrapper[28758]: I0223 14:52:16.465017 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde","Type":"ContainerDied","Data":"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2"} Feb 23 14:52:16.465084 master-0 kubenswrapper[28758]: I0223 14:52:16.465031 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde","Type":"ContainerDied","Data":"4e416c7e63802f310e573ca4985aac973593ad3e5b46bf813971a8c11f49bbef"} Feb 23 14:52:16.465084 master-0 kubenswrapper[28758]: I0223 14:52:16.465051 28758 scope.go:117] "RemoveContainer" containerID="03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07" Feb 23 14:52:16.465249 master-0 kubenswrapper[28758]: I0223 14:52:16.465219 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:16.468788 master-0 kubenswrapper[28758]: I0223 14:52:16.468733 28758 generic.go:334] "Generic (PLEG): container finished" podID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerID="62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132" exitCode=143 Feb 23 14:52:16.468968 master-0 kubenswrapper[28758]: I0223 14:52:16.468799 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07be6be7-f047-46d7-a28b-cc077ce2261c","Type":"ContainerDied","Data":"62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132"} Feb 23 14:52:16.501425 master-0 kubenswrapper[28758]: I0223 14:52:16.501332 28758 scope.go:117] "RemoveContainer" containerID="7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2" Feb 23 14:52:16.525717 master-0 kubenswrapper[28758]: I0223 14:52:16.525653 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-combined-ca-bundle\") pod \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " Feb 23 14:52:16.525910 master-0 kubenswrapper[28758]: I0223 14:52:16.525800 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-internal-tls-certs\") pod 
\"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " Feb 23 14:52:16.525910 master-0 kubenswrapper[28758]: I0223 14:52:16.525856 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-config-data\") pod \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " Feb 23 14:52:16.525910 master-0 kubenswrapper[28758]: I0223 14:52:16.525897 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktnq9\" (UniqueName: \"kubernetes.io/projected/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-kube-api-access-ktnq9\") pod \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " Feb 23 14:52:16.526019 master-0 kubenswrapper[28758]: I0223 14:52:16.525946 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-logs\") pod \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " Feb 23 14:52:16.526184 master-0 kubenswrapper[28758]: I0223 14:52:16.526130 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-public-tls-certs\") pod \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\" (UID: \"0ecff41d-8f4e-4d83-95fe-0c9ac487ebde\") " Feb 23 14:52:16.526550 master-0 kubenswrapper[28758]: I0223 14:52:16.526450 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-logs" (OuterVolumeSpecName: "logs") pod "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" (UID: "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:52:16.529247 master-0 kubenswrapper[28758]: I0223 14:52:16.527275 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.529645 master-0 kubenswrapper[28758]: I0223 14:52:16.529573 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-kube-api-access-ktnq9" (OuterVolumeSpecName: "kube-api-access-ktnq9") pod "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" (UID: "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde"). InnerVolumeSpecName "kube-api-access-ktnq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:52:16.557111 master-0 kubenswrapper[28758]: I0223 14:52:16.556998 28758 scope.go:117] "RemoveContainer" containerID="03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07" Feb 23 14:52:16.557777 master-0 kubenswrapper[28758]: E0223 14:52:16.557620 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07\": container with ID starting with 03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07 not found: ID does not exist" containerID="03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07" Feb 23 14:52:16.557777 master-0 kubenswrapper[28758]: I0223 14:52:16.557670 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07"} err="failed to get container status \"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07\": rpc error: code = NotFound desc = could not find container \"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07\": container with ID starting with 
03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07 not found: ID does not exist" Feb 23 14:52:16.557777 master-0 kubenswrapper[28758]: I0223 14:52:16.557697 28758 scope.go:117] "RemoveContainer" containerID="7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2" Feb 23 14:52:16.558528 master-0 kubenswrapper[28758]: E0223 14:52:16.558109 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2\": container with ID starting with 7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2 not found: ID does not exist" containerID="7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2" Feb 23 14:52:16.558528 master-0 kubenswrapper[28758]: I0223 14:52:16.558164 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2"} err="failed to get container status \"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2\": rpc error: code = NotFound desc = could not find container \"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2\": container with ID starting with 7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2 not found: ID does not exist" Feb 23 14:52:16.558528 master-0 kubenswrapper[28758]: I0223 14:52:16.558201 28758 scope.go:117] "RemoveContainer" containerID="03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07" Feb 23 14:52:16.558696 master-0 kubenswrapper[28758]: I0223 14:52:16.558659 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07"} err="failed to get container status \"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07\": rpc error: code = NotFound desc = could not find container 
\"03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07\": container with ID starting with 03576a26302c4ea098c47a29b1b158409b6008394f67c697da3d7d646c745e07 not found: ID does not exist" Feb 23 14:52:16.558696 master-0 kubenswrapper[28758]: I0223 14:52:16.558685 28758 scope.go:117] "RemoveContainer" containerID="7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2" Feb 23 14:52:16.559093 master-0 kubenswrapper[28758]: I0223 14:52:16.559049 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2"} err="failed to get container status \"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2\": rpc error: code = NotFound desc = could not find container \"7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2\": container with ID starting with 7e377e0183295d37400bf1498dbee653d35b264381906e2bfc0aea6ebc8f23d2 not found: ID does not exist" Feb 23 14:52:16.571073 master-0 kubenswrapper[28758]: I0223 14:52:16.571016 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" (UID: "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:16.572030 master-0 kubenswrapper[28758]: I0223 14:52:16.571978 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-config-data" (OuterVolumeSpecName: "config-data") pod "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" (UID: "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:16.595336 master-0 kubenswrapper[28758]: I0223 14:52:16.595268 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" (UID: "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:16.608179 master-0 kubenswrapper[28758]: I0223 14:52:16.607565 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" (UID: "0ecff41d-8f4e-4d83-95fe-0c9ac487ebde"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:16.616890 master-0 kubenswrapper[28758]: I0223 14:52:16.616322 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:52:16.630725 master-0 kubenswrapper[28758]: I0223 14:52:16.630626 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktnq9\" (UniqueName: \"kubernetes.io/projected/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-kube-api-access-ktnq9\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.630725 master-0 kubenswrapper[28758]: I0223 14:52:16.630710 28758 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.630725 master-0 kubenswrapper[28758]: I0223 14:52:16.630724 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.631097 master-0 kubenswrapper[28758]: I0223 14:52:16.630736 28758 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.631097 master-0 kubenswrapper[28758]: I0223 14:52:16.630774 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.732761 master-0 kubenswrapper[28758]: I0223 14:52:16.732690 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-config-data\") pod \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " Feb 23 14:52:16.732761 master-0 kubenswrapper[28758]: I0223 14:52:16.732773 28758 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jwdf5\" (UniqueName: \"kubernetes.io/projected/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-kube-api-access-jwdf5\") pod \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " Feb 23 14:52:16.733045 master-0 kubenswrapper[28758]: I0223 14:52:16.733013 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-combined-ca-bundle\") pod \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\" (UID: \"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec\") " Feb 23 14:52:16.736615 master-0 kubenswrapper[28758]: I0223 14:52:16.736559 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-kube-api-access-jwdf5" (OuterVolumeSpecName: "kube-api-access-jwdf5") pod "8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" (UID: "8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec"). InnerVolumeSpecName "kube-api-access-jwdf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:52:16.770651 master-0 kubenswrapper[28758]: I0223 14:52:16.770496 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-config-data" (OuterVolumeSpecName: "config-data") pod "8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" (UID: "8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:16.780608 master-0 kubenswrapper[28758]: I0223 14:52:16.780525 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" (UID: "8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:16.823719 master-0 kubenswrapper[28758]: I0223 14:52:16.823622 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:16.836161 master-0 kubenswrapper[28758]: I0223 14:52:16.836092 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.836161 master-0 kubenswrapper[28758]: I0223 14:52:16.836146 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.836161 master-0 kubenswrapper[28758]: I0223 14:52:16.836157 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwdf5\" (UniqueName: \"kubernetes.io/projected/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec-kube-api-access-jwdf5\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:16.842183 master-0 kubenswrapper[28758]: I0223 14:52:16.842133 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.858919 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: E0223 14:52:16.859534 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" containerName="nova-scheduler-scheduler" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.859554 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" containerName="nova-scheduler-scheduler" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: E0223 14:52:16.859589 28758 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="64cc097b-5a40-40dc-961d-2951e576f39e" containerName="nova-manage" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.859598 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="64cc097b-5a40-40dc-961d-2951e576f39e" containerName="nova-manage" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: E0223 14:52:16.859631 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ec10e8-80ea-4fc3-93ce-3deec7202918" containerName="nova-manage" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.859640 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ec10e8-80ea-4fc3-93ce-3deec7202918" containerName="nova-manage" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: E0223 14:52:16.859659 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79275321-c948-4636-bcdf-2c0ab4f02076" containerName="dnsmasq-dns" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.859667 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="79275321-c948-4636-bcdf-2c0ab4f02076" containerName="dnsmasq-dns" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: E0223 14:52:16.859690 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-api" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.859699 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-api" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: E0223 14:52:16.859720 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-log" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.859727 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-log" Feb 23 14:52:16.878469 master-0 
kubenswrapper[28758]: E0223 14:52:16.859757 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79275321-c948-4636-bcdf-2c0ab4f02076" containerName="init" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.859766 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="79275321-c948-4636-bcdf-2c0ab4f02076" containerName="init" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.860038 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" containerName="nova-scheduler-scheduler" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.860071 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="64cc097b-5a40-40dc-961d-2951e576f39e" containerName="nova-manage" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.860094 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="79275321-c948-4636-bcdf-2c0ab4f02076" containerName="dnsmasq-dns" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.860121 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-api" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.860138 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" containerName="nova-api-log" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.860161 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ec10e8-80ea-4fc3-93ce-3deec7202918" containerName="nova-manage" Feb 23 14:52:16.878469 master-0 kubenswrapper[28758]: I0223 14:52:16.877767 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:16.882587 master-0 kubenswrapper[28758]: I0223 14:52:16.881311 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 23 14:52:16.886006 master-0 kubenswrapper[28758]: I0223 14:52:16.882782 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 23 14:52:16.886006 master-0 kubenswrapper[28758]: I0223 14:52:16.882849 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 23 14:52:16.891831 master-0 kubenswrapper[28758]: I0223 14:52:16.891764 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:16.965134 master-0 kubenswrapper[28758]: I0223 14:52:16.965033 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:16.965428 master-0 kubenswrapper[28758]: I0223 14:52:16.965185 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:16.965428 master-0 kubenswrapper[28758]: I0223 14:52:16.965258 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d31ce50-c005-43ee-b185-fc4991705cf2-logs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:16.965428 master-0 kubenswrapper[28758]: I0223 14:52:16.965363 28758 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-config-data\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:16.965773 master-0 kubenswrapper[28758]: I0223 14:52:16.965728 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz5mg\" (UniqueName: \"kubernetes.io/projected/3d31ce50-c005-43ee-b185-fc4991705cf2-kube-api-access-cz5mg\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:16.965832 master-0 kubenswrapper[28758]: I0223 14:52:16.965801 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.067496 master-0 kubenswrapper[28758]: I0223 14:52:17.067404 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.067781 master-0 kubenswrapper[28758]: I0223 14:52:17.067656 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d31ce50-c005-43ee-b185-fc4991705cf2-logs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.067781 master-0 kubenswrapper[28758]: I0223 14:52:17.067713 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-config-data\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.067893 master-0 kubenswrapper[28758]: I0223 14:52:17.067789 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz5mg\" (UniqueName: \"kubernetes.io/projected/3d31ce50-c005-43ee-b185-fc4991705cf2-kube-api-access-cz5mg\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.067893 master-0 kubenswrapper[28758]: I0223 14:52:17.067822 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.067893 master-0 kubenswrapper[28758]: I0223 14:52:17.067881 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.068110 master-0 kubenswrapper[28758]: I0223 14:52:17.068056 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d31ce50-c005-43ee-b185-fc4991705cf2-logs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.071445 master-0 kubenswrapper[28758]: I0223 14:52:17.071397 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" 
Feb 23 14:52:17.071445 master-0 kubenswrapper[28758]: I0223 14:52:17.071421 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-config-data\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.072582 master-0 kubenswrapper[28758]: I0223 14:52:17.072551 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.072962 master-0 kubenswrapper[28758]: I0223 14:52:17.072923 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d31ce50-c005-43ee-b185-fc4991705cf2-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.087555 master-0 kubenswrapper[28758]: I0223 14:52:17.084903 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz5mg\" (UniqueName: \"kubernetes.io/projected/3d31ce50-c005-43ee-b185-fc4991705cf2-kube-api-access-cz5mg\") pod \"nova-api-0\" (UID: \"3d31ce50-c005-43ee-b185-fc4991705cf2\") " pod="openstack/nova-api-0" Feb 23 14:52:17.283046 master-0 kubenswrapper[28758]: I0223 14:52:17.283001 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 23 14:52:17.484039 master-0 kubenswrapper[28758]: I0223 14:52:17.483607 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec","Type":"ContainerDied","Data":"13e3c12dedbe97c1796ce4352b82f4344a1290bea7f1b8c67275fd3a6999b593"} Feb 23 14:52:17.484039 master-0 kubenswrapper[28758]: I0223 14:52:17.483616 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:52:17.484039 master-0 kubenswrapper[28758]: I0223 14:52:17.483692 28758 scope.go:117] "RemoveContainer" containerID="bf8ea5b460d72729ee24b47e584409946a22df206c7b08d7bcca7e525d06c63c" Feb 23 14:52:17.530180 master-0 kubenswrapper[28758]: I0223 14:52:17.530100 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:52:17.543195 master-0 kubenswrapper[28758]: I0223 14:52:17.542887 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:52:17.575603 master-0 kubenswrapper[28758]: I0223 14:52:17.571869 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:52:17.575603 master-0 kubenswrapper[28758]: I0223 14:52:17.573530 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:52:17.576594 master-0 kubenswrapper[28758]: I0223 14:52:17.576555 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.576695 master-0 kubenswrapper[28758]: I0223 14:52:17.576642 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5xr\" (UniqueName: \"kubernetes.io/projected/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-kube-api-access-4q5xr\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.576950 master-0 kubenswrapper[28758]: I0223 14:52:17.576922 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-config-data\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.582564 master-0 kubenswrapper[28758]: I0223 14:52:17.582510 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 23 14:52:17.587432 master-0 kubenswrapper[28758]: I0223 14:52:17.586319 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:52:17.679578 master-0 kubenswrapper[28758]: I0223 14:52:17.679490 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" 
Feb 23 14:52:17.679869 master-0 kubenswrapper[28758]: I0223 14:52:17.679595 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5xr\" (UniqueName: \"kubernetes.io/projected/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-kube-api-access-4q5xr\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.679869 master-0 kubenswrapper[28758]: I0223 14:52:17.679651 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-config-data\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.685047 master-0 kubenswrapper[28758]: I0223 14:52:17.684949 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.685269 master-0 kubenswrapper[28758]: I0223 14:52:17.685198 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-config-data\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.698705 master-0 kubenswrapper[28758]: I0223 14:52:17.698624 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5xr\" (UniqueName: \"kubernetes.io/projected/6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6-kube-api-access-4q5xr\") pod \"nova-scheduler-0\" (UID: \"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6\") " pod="openstack/nova-scheduler-0" Feb 23 14:52:17.737843 master-0 kubenswrapper[28758]: I0223 14:52:17.736712 28758 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 23 14:52:17.923830 master-0 kubenswrapper[28758]: I0223 14:52:17.923678 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 23 14:52:18.116825 master-0 kubenswrapper[28758]: I0223 14:52:18.105000 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecff41d-8f4e-4d83-95fe-0c9ac487ebde" path="/var/lib/kubelet/pods/0ecff41d-8f4e-4d83-95fe-0c9ac487ebde/volumes" Feb 23 14:52:18.116825 master-0 kubenswrapper[28758]: I0223 14:52:18.105823 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec" path="/var/lib/kubelet/pods/8b8e2511-b5e6-4a0a-9baf-d5be2c0a58ec/volumes" Feb 23 14:52:18.395684 master-0 kubenswrapper[28758]: I0223 14:52:18.395565 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 23 14:52:18.541354 master-0 kubenswrapper[28758]: I0223 14:52:18.541269 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6","Type":"ContainerStarted","Data":"db84ce87c48f7aa592fcc5df85042a32f64ab6538f10266bddfb4173dbc6f4ab"} Feb 23 14:52:18.544758 master-0 kubenswrapper[28758]: I0223 14:52:18.544699 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d31ce50-c005-43ee-b185-fc4991705cf2","Type":"ContainerStarted","Data":"0979031abbbde022bda8d46d60019695922456d09cf1839a73872d476b889d42"} Feb 23 14:52:18.544902 master-0 kubenswrapper[28758]: I0223 14:52:18.544884 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d31ce50-c005-43ee-b185-fc4991705cf2","Type":"ContainerStarted","Data":"c63eec6d3cc4b4f605687b973e1a8f7b7998ef1b76216de5ce19c3f4fc873efc"} Feb 23 14:52:18.545028 master-0 kubenswrapper[28758]: I0223 14:52:18.545011 28758 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"3d31ce50-c005-43ee-b185-fc4991705cf2","Type":"ContainerStarted","Data":"79263473dbc07548f763978703dc8f8b9104ba34f998527b2e49c4e59f434bad"} Feb 23 14:52:18.583020 master-0 kubenswrapper[28758]: I0223 14:52:18.582912 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.58288565 podStartE2EDuration="2.58288565s" podCreationTimestamp="2026-02-23 14:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:52:18.566789974 +0000 UTC m=+1070.693105906" watchObservedRunningTime="2026-02-23 14:52:18.58288565 +0000 UTC m=+1070.709201582" Feb 23 14:52:18.875958 master-0 kubenswrapper[28758]: I0223 14:52:18.875846 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.11:8775/\": read tcp 10.128.0.2:59262->10.128.1.11:8775: read: connection reset by peer" Feb 23 14:52:18.875958 master-0 kubenswrapper[28758]: I0223 14:52:18.875912 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.11:8775/\": read tcp 10.128.0.2:59264->10.128.1.11:8775: read: connection reset by peer" Feb 23 14:52:19.382470 master-0 kubenswrapper[28758]: I0223 14:52:19.382414 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:52:19.433527 master-0 kubenswrapper[28758]: I0223 14:52:19.430932 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvqrn\" (UniqueName: \"kubernetes.io/projected/07be6be7-f047-46d7-a28b-cc077ce2261c-kube-api-access-hvqrn\") pod \"07be6be7-f047-46d7-a28b-cc077ce2261c\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " Feb 23 14:52:19.433527 master-0 kubenswrapper[28758]: I0223 14:52:19.431003 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-config-data\") pod \"07be6be7-f047-46d7-a28b-cc077ce2261c\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " Feb 23 14:52:19.433527 master-0 kubenswrapper[28758]: I0223 14:52:19.431078 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-nova-metadata-tls-certs\") pod \"07be6be7-f047-46d7-a28b-cc077ce2261c\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " Feb 23 14:52:19.433527 master-0 kubenswrapper[28758]: I0223 14:52:19.431227 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-combined-ca-bundle\") pod \"07be6be7-f047-46d7-a28b-cc077ce2261c\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " Feb 23 14:52:19.433527 master-0 kubenswrapper[28758]: I0223 14:52:19.431337 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07be6be7-f047-46d7-a28b-cc077ce2261c-logs\") pod \"07be6be7-f047-46d7-a28b-cc077ce2261c\" (UID: \"07be6be7-f047-46d7-a28b-cc077ce2261c\") " Feb 23 14:52:19.433527 master-0 kubenswrapper[28758]: I0223 14:52:19.432922 28758 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07be6be7-f047-46d7-a28b-cc077ce2261c-logs" (OuterVolumeSpecName: "logs") pod "07be6be7-f047-46d7-a28b-cc077ce2261c" (UID: "07be6be7-f047-46d7-a28b-cc077ce2261c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 14:52:19.439003 master-0 kubenswrapper[28758]: I0223 14:52:19.438786 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07be6be7-f047-46d7-a28b-cc077ce2261c-kube-api-access-hvqrn" (OuterVolumeSpecName: "kube-api-access-hvqrn") pod "07be6be7-f047-46d7-a28b-cc077ce2261c" (UID: "07be6be7-f047-46d7-a28b-cc077ce2261c"). InnerVolumeSpecName "kube-api-access-hvqrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:52:19.467514 master-0 kubenswrapper[28758]: I0223 14:52:19.467357 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-config-data" (OuterVolumeSpecName: "config-data") pod "07be6be7-f047-46d7-a28b-cc077ce2261c" (UID: "07be6be7-f047-46d7-a28b-cc077ce2261c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:19.473710 master-0 kubenswrapper[28758]: I0223 14:52:19.473247 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07be6be7-f047-46d7-a28b-cc077ce2261c" (UID: "07be6be7-f047-46d7-a28b-cc077ce2261c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:19.524808 master-0 kubenswrapper[28758]: I0223 14:52:19.524739 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "07be6be7-f047-46d7-a28b-cc077ce2261c" (UID: "07be6be7-f047-46d7-a28b-cc077ce2261c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:52:19.534584 master-0 kubenswrapper[28758]: I0223 14:52:19.534442 28758 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07be6be7-f047-46d7-a28b-cc077ce2261c-logs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:19.534584 master-0 kubenswrapper[28758]: I0223 14:52:19.534548 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvqrn\" (UniqueName: \"kubernetes.io/projected/07be6be7-f047-46d7-a28b-cc077ce2261c-kube-api-access-hvqrn\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:19.534584 master-0 kubenswrapper[28758]: I0223 14:52:19.534563 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:19.534584 master-0 kubenswrapper[28758]: I0223 14:52:19.534576 28758 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:19.534584 master-0 kubenswrapper[28758]: I0223 14:52:19.534588 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07be6be7-f047-46d7-a28b-cc077ce2261c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 14:52:19.569037 
master-0 kubenswrapper[28758]: I0223 14:52:19.568269 28758 generic.go:334] "Generic (PLEG): container finished" podID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerID="7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2" exitCode=0 Feb 23 14:52:19.569037 master-0 kubenswrapper[28758]: I0223 14:52:19.568356 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07be6be7-f047-46d7-a28b-cc077ce2261c","Type":"ContainerDied","Data":"7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2"} Feb 23 14:52:19.569037 master-0 kubenswrapper[28758]: I0223 14:52:19.568390 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07be6be7-f047-46d7-a28b-cc077ce2261c","Type":"ContainerDied","Data":"089cf05f9de48ff9c8cbbb99013397bdf64f81b9aad91fff57fccff9dbe06bca"} Feb 23 14:52:19.569037 master-0 kubenswrapper[28758]: I0223 14:52:19.568413 28758 scope.go:117] "RemoveContainer" containerID="7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2" Feb 23 14:52:19.569037 master-0 kubenswrapper[28758]: I0223 14:52:19.568571 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:52:19.577434 master-0 kubenswrapper[28758]: I0223 14:52:19.577364 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6","Type":"ContainerStarted","Data":"a50a982c1b0b3f04628d288664d2394e91b6cf417241795f5188aabeb3c6f78e"} Feb 23 14:52:19.602521 master-0 kubenswrapper[28758]: I0223 14:52:19.602438 28758 scope.go:117] "RemoveContainer" containerID="62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132" Feb 23 14:52:19.632857 master-0 kubenswrapper[28758]: I0223 14:52:19.632643 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6326203870000002 podStartE2EDuration="2.632620387s" podCreationTimestamp="2026-02-23 14:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:52:19.608087473 +0000 UTC m=+1071.734403405" watchObservedRunningTime="2026-02-23 14:52:19.632620387 +0000 UTC m=+1071.758936339" Feb 23 14:52:19.660769 master-0 kubenswrapper[28758]: I0223 14:52:19.658446 28758 scope.go:117] "RemoveContainer" containerID="7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2" Feb 23 14:52:19.664635 master-0 kubenswrapper[28758]: I0223 14:52:19.664543 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:52:19.666602 master-0 kubenswrapper[28758]: E0223 14:52:19.666213 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2\": container with ID starting with 7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2 not found: ID does not exist" containerID="7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2" Feb 23 
14:52:19.666602 master-0 kubenswrapper[28758]: I0223 14:52:19.666316 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2"} err="failed to get container status \"7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2\": rpc error: code = NotFound desc = could not find container \"7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2\": container with ID starting with 7f2516722ecfb46cfaeeb87d56fe69c1f08ff21e506710b29ec6c3ce233778f2 not found: ID does not exist" Feb 23 14:52:19.666602 master-0 kubenswrapper[28758]: I0223 14:52:19.666354 28758 scope.go:117] "RemoveContainer" containerID="62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132" Feb 23 14:52:19.668703 master-0 kubenswrapper[28758]: E0223 14:52:19.668637 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132\": container with ID starting with 62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132 not found: ID does not exist" containerID="62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132" Feb 23 14:52:19.668789 master-0 kubenswrapper[28758]: I0223 14:52:19.668701 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132"} err="failed to get container status \"62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132\": rpc error: code = NotFound desc = could not find container \"62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132\": container with ID starting with 62fca3409b9e55c698632023db5afaf837ed61c8a521fe1d96747b4a2141f132 not found: ID does not exist" Feb 23 14:52:19.675924 master-0 kubenswrapper[28758]: I0223 14:52:19.673541 28758 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:52:19.686504 master-0 kubenswrapper[28758]: I0223 14:52:19.686405 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:52:19.687270 master-0 kubenswrapper[28758]: E0223 14:52:19.687229 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-metadata" Feb 23 14:52:19.687270 master-0 kubenswrapper[28758]: I0223 14:52:19.687261 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-metadata" Feb 23 14:52:19.687371 master-0 kubenswrapper[28758]: E0223 14:52:19.687301 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-log" Feb 23 14:52:19.687371 master-0 kubenswrapper[28758]: I0223 14:52:19.687313 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-log" Feb 23 14:52:19.687654 master-0 kubenswrapper[28758]: I0223 14:52:19.687628 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-metadata" Feb 23 14:52:19.687721 master-0 kubenswrapper[28758]: I0223 14:52:19.687704 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" containerName="nova-metadata-log" Feb 23 14:52:19.689434 master-0 kubenswrapper[28758]: I0223 14:52:19.689275 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:52:19.692624 master-0 kubenswrapper[28758]: I0223 14:52:19.692574 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 23 14:52:19.692829 master-0 kubenswrapper[28758]: I0223 14:52:19.692594 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 23 14:52:19.723470 master-0 kubenswrapper[28758]: I0223 14:52:19.722785 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:52:19.844222 master-0 kubenswrapper[28758]: I0223 14:52:19.843504 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-config-data\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.844222 master-0 kubenswrapper[28758]: I0223 14:52:19.843601 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.844222 master-0 kubenswrapper[28758]: I0223 14:52:19.843688 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-logs\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.844222 master-0 kubenswrapper[28758]: I0223 14:52:19.843818 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq7nc\" (UniqueName: 
\"kubernetes.io/projected/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-kube-api-access-gq7nc\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.844222 master-0 kubenswrapper[28758]: I0223 14:52:19.843865 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.947504 master-0 kubenswrapper[28758]: I0223 14:52:19.947407 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq7nc\" (UniqueName: \"kubernetes.io/projected/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-kube-api-access-gq7nc\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.947750 master-0 kubenswrapper[28758]: I0223 14:52:19.947550 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.947804 master-0 kubenswrapper[28758]: I0223 14:52:19.947764 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-config-data\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.947847 master-0 kubenswrapper[28758]: I0223 14:52:19.947831 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.947965 master-0 kubenswrapper[28758]: I0223 14:52:19.947931 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-logs\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.948705 master-0 kubenswrapper[28758]: I0223 14:52:19.948664 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-logs\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.956504 master-0 kubenswrapper[28758]: I0223 14:52:19.954728 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.956504 master-0 kubenswrapper[28758]: I0223 14:52:19.955037 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-config-data\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:19.956504 master-0 kubenswrapper[28758]: I0223 14:52:19.955497 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " 
pod="openstack/nova-metadata-0" Feb 23 14:52:19.968621 master-0 kubenswrapper[28758]: I0223 14:52:19.968215 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq7nc\" (UniqueName: \"kubernetes.io/projected/9f3c354f-1b43-43ba-9c77-f19db5f74a2f-kube-api-access-gq7nc\") pod \"nova-metadata-0\" (UID: \"9f3c354f-1b43-43ba-9c77-f19db5f74a2f\") " pod="openstack/nova-metadata-0" Feb 23 14:52:20.014781 master-0 kubenswrapper[28758]: I0223 14:52:20.014705 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 23 14:52:20.109180 master-0 kubenswrapper[28758]: I0223 14:52:20.108908 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07be6be7-f047-46d7-a28b-cc077ce2261c" path="/var/lib/kubelet/pods/07be6be7-f047-46d7-a28b-cc077ce2261c/volumes" Feb 23 14:52:20.495648 master-0 kubenswrapper[28758]: I0223 14:52:20.495492 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 23 14:52:20.591383 master-0 kubenswrapper[28758]: I0223 14:52:20.591333 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3c354f-1b43-43ba-9c77-f19db5f74a2f","Type":"ContainerStarted","Data":"8c00c37a55083d86c3c78e94285baa78d4559d1d8ea620a30dca87617e9da131"} Feb 23 14:52:21.608119 master-0 kubenswrapper[28758]: I0223 14:52:21.608039 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3c354f-1b43-43ba-9c77-f19db5f74a2f","Type":"ContainerStarted","Data":"2c90d287990e9aba08881f7d94b2c9731e4675e216882b98d0a2002b10a10516"} Feb 23 14:52:21.608119 master-0 kubenswrapper[28758]: I0223 14:52:21.608105 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f3c354f-1b43-43ba-9c77-f19db5f74a2f","Type":"ContainerStarted","Data":"cdad82f2fd62ac2fe94ad3ce73a1e3f00196657564b26366b76165dff2058f04"} Feb 23 14:52:21.637060 
master-0 kubenswrapper[28758]: I0223 14:52:21.634926 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.634909611 podStartE2EDuration="2.634909611s" podCreationTimestamp="2026-02-23 14:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:52:21.63311962 +0000 UTC m=+1073.759435552" watchObservedRunningTime="2026-02-23 14:52:21.634909611 +0000 UTC m=+1073.761225543" Feb 23 14:52:22.924140 master-0 kubenswrapper[28758]: I0223 14:52:22.924060 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 23 14:52:25.015511 master-0 kubenswrapper[28758]: I0223 14:52:25.015417 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 14:52:25.015511 master-0 kubenswrapper[28758]: I0223 14:52:25.015516 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 23 14:52:27.283673 master-0 kubenswrapper[28758]: I0223 14:52:27.283604 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 14:52:27.283673 master-0 kubenswrapper[28758]: I0223 14:52:27.283676 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 23 14:52:27.924115 master-0 kubenswrapper[28758]: I0223 14:52:27.924025 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 23 14:52:27.956557 master-0 kubenswrapper[28758]: I0223 14:52:27.956502 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 23 14:52:28.296792 master-0 kubenswrapper[28758]: I0223 14:52:28.296711 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="3d31ce50-c005-43ee-b185-fc4991705cf2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.18:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:52:28.297331 master-0 kubenswrapper[28758]: I0223 14:52:28.296712 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3d31ce50-c005-43ee-b185-fc4991705cf2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.18:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:52:28.724756 master-0 kubenswrapper[28758]: I0223 14:52:28.724696 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 23 14:52:30.016064 master-0 kubenswrapper[28758]: I0223 14:52:30.015985 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 14:52:30.016064 master-0 kubenswrapper[28758]: I0223 14:52:30.016062 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 23 14:52:31.033717 master-0 kubenswrapper[28758]: I0223 14:52:31.033610 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f3c354f-1b43-43ba-9c77-f19db5f74a2f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.20:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:52:31.035611 master-0 kubenswrapper[28758]: I0223 14:52:31.033641 28758 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9f3c354f-1b43-43ba-9c77-f19db5f74a2f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.20:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:52:37.293469 master-0 kubenswrapper[28758]: I0223 14:52:37.293298 
28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 14:52:37.294253 master-0 kubenswrapper[28758]: I0223 14:52:37.293803 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 14:52:37.296783 master-0 kubenswrapper[28758]: I0223 14:52:37.296645 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 23 14:52:37.302789 master-0 kubenswrapper[28758]: I0223 14:52:37.302728 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 14:52:37.805142 master-0 kubenswrapper[28758]: I0223 14:52:37.805054 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 23 14:52:37.814744 master-0 kubenswrapper[28758]: I0223 14:52:37.814556 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 23 14:52:40.019849 master-0 kubenswrapper[28758]: I0223 14:52:40.019785 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 14:52:40.028103 master-0 kubenswrapper[28758]: I0223 14:52:40.028005 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 14:52:40.029087 master-0 kubenswrapper[28758]: I0223 14:52:40.028427 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 23 14:52:40.842056 master-0 kubenswrapper[28758]: I0223 14:52:40.841976 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 23 14:53:08.115376 master-0 kubenswrapper[28758]: I0223 14:53:08.115310 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xjjxf"] Feb 23 14:53:08.116130 master-0 kubenswrapper[28758]: I0223 14:53:08.115559 28758 
kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" podUID="dde92d85-6a81-45fa-a900-097b3dc3af38" containerName="sushy-emulator" containerID="cri-o://3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2" gracePeriod=30 Feb 23 14:53:08.778911 master-0 kubenswrapper[28758]: I0223 14:53:08.778850 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:53:08.858848 master-0 kubenswrapper[28758]: I0223 14:53:08.858779 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-q4jwr"] Feb 23 14:53:08.859440 master-0 kubenswrapper[28758]: E0223 14:53:08.859411 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde92d85-6a81-45fa-a900-097b3dc3af38" containerName="sushy-emulator" Feb 23 14:53:08.859440 master-0 kubenswrapper[28758]: I0223 14:53:08.859437 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde92d85-6a81-45fa-a900-097b3dc3af38" containerName="sushy-emulator" Feb 23 14:53:08.859925 master-0 kubenswrapper[28758]: I0223 14:53:08.859897 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde92d85-6a81-45fa-a900-097b3dc3af38" containerName="sushy-emulator" Feb 23 14:53:08.864426 master-0 kubenswrapper[28758]: I0223 14:53:08.864381 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:08.866452 master-0 kubenswrapper[28758]: I0223 14:53:08.866395 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/dde92d85-6a81-45fa-a900-097b3dc3af38-sushy-emulator-config\") pod \"dde92d85-6a81-45fa-a900-097b3dc3af38\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " Feb 23 14:53:08.866618 master-0 kubenswrapper[28758]: I0223 14:53:08.866596 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk6d2\" (UniqueName: \"kubernetes.io/projected/dde92d85-6a81-45fa-a900-097b3dc3af38-kube-api-access-lk6d2\") pod \"dde92d85-6a81-45fa-a900-097b3dc3af38\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " Feb 23 14:53:08.866679 master-0 kubenswrapper[28758]: I0223 14:53:08.866656 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/dde92d85-6a81-45fa-a900-097b3dc3af38-os-client-config\") pod \"dde92d85-6a81-45fa-a900-097b3dc3af38\" (UID: \"dde92d85-6a81-45fa-a900-097b3dc3af38\") " Feb 23 14:53:08.867020 master-0 kubenswrapper[28758]: I0223 14:53:08.866981 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde92d85-6a81-45fa-a900-097b3dc3af38-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "dde92d85-6a81-45fa-a900-097b3dc3af38" (UID: "dde92d85-6a81-45fa-a900-097b3dc3af38"). InnerVolumeSpecName "sushy-emulator-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 14:53:08.867399 master-0 kubenswrapper[28758]: I0223 14:53:08.867373 28758 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/dde92d85-6a81-45fa-a900-097b3dc3af38-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:53:08.871292 master-0 kubenswrapper[28758]: I0223 14:53:08.871255 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dde92d85-6a81-45fa-a900-097b3dc3af38-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "dde92d85-6a81-45fa-a900-097b3dc3af38" (UID: "dde92d85-6a81-45fa-a900-097b3dc3af38"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 14:53:08.871850 master-0 kubenswrapper[28758]: I0223 14:53:08.871800 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde92d85-6a81-45fa-a900-097b3dc3af38-kube-api-access-lk6d2" (OuterVolumeSpecName: "kube-api-access-lk6d2") pod "dde92d85-6a81-45fa-a900-097b3dc3af38" (UID: "dde92d85-6a81-45fa-a900-097b3dc3af38"). InnerVolumeSpecName "kube-api-access-lk6d2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 14:53:08.872456 master-0 kubenswrapper[28758]: I0223 14:53:08.872423 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-q4jwr"] Feb 23 14:53:08.969823 master-0 kubenswrapper[28758]: I0223 14:53:08.969668 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:08.969823 master-0 kubenswrapper[28758]: I0223 14:53:08.969761 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqkdh\" (UniqueName: \"kubernetes.io/projected/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-kube-api-access-hqkdh\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:08.970368 master-0 kubenswrapper[28758]: I0223 14:53:08.970309 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-os-client-config\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:08.970709 master-0 kubenswrapper[28758]: I0223 14:53:08.970661 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk6d2\" (UniqueName: \"kubernetes.io/projected/dde92d85-6a81-45fa-a900-097b3dc3af38-kube-api-access-lk6d2\") on node \"master-0\" DevicePath \"\"" Feb 23 14:53:08.970709 master-0 kubenswrapper[28758]: I0223 14:53:08.970708 28758 reconciler_common.go:293] "Volume detached for 
volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/dde92d85-6a81-45fa-a900-097b3dc3af38-os-client-config\") on node \"master-0\" DevicePath \"\"" Feb 23 14:53:09.072607 master-0 kubenswrapper[28758]: I0223 14:53:09.072527 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-os-client-config\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:09.072945 master-0 kubenswrapper[28758]: I0223 14:53:09.072710 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:09.072945 master-0 kubenswrapper[28758]: I0223 14:53:09.072752 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqkdh\" (UniqueName: \"kubernetes.io/projected/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-kube-api-access-hqkdh\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:09.074093 master-0 kubenswrapper[28758]: I0223 14:53:09.073993 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:09.076652 master-0 kubenswrapper[28758]: I0223 14:53:09.076605 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-client-config\" (UniqueName: \"kubernetes.io/secret/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-os-client-config\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:09.090468 master-0 kubenswrapper[28758]: I0223 14:53:09.090349 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqkdh\" (UniqueName: \"kubernetes.io/projected/b5167d84-5ffc-46bf-bb86-8d5b1f9957de-kube-api-access-hqkdh\") pod \"sushy-emulator-84965d5d88-q4jwr\" (UID: \"b5167d84-5ffc-46bf-bb86-8d5b1f9957de\") " pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:09.201181 master-0 kubenswrapper[28758]: I0223 14:53:09.201105 28758 generic.go:334] "Generic (PLEG): container finished" podID="dde92d85-6a81-45fa-a900-097b3dc3af38" containerID="3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2" exitCode=0 Feb 23 14:53:09.201181 master-0 kubenswrapper[28758]: I0223 14:53:09.201173 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" event={"ID":"dde92d85-6a81-45fa-a900-097b3dc3af38","Type":"ContainerDied","Data":"3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2"} Feb 23 14:53:09.201896 master-0 kubenswrapper[28758]: I0223 14:53:09.201211 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" event={"ID":"dde92d85-6a81-45fa-a900-097b3dc3af38","Type":"ContainerDied","Data":"f56539d3604b2838d97f3306cb4d70a39fba6b8748960044e591b34848cd4570"} Feb 23 14:53:09.201896 master-0 kubenswrapper[28758]: I0223 14:53:09.201237 28758 scope.go:117] "RemoveContainer" containerID="3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2" Feb 23 14:53:09.201896 master-0 kubenswrapper[28758]: I0223 14:53:09.201240 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xjjxf" Feb 23 14:53:09.237509 master-0 kubenswrapper[28758]: I0223 14:53:09.237409 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:09.253718 master-0 kubenswrapper[28758]: I0223 14:53:09.253471 28758 scope.go:117] "RemoveContainer" containerID="3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2" Feb 23 14:53:09.254049 master-0 kubenswrapper[28758]: E0223 14:53:09.253982 28758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2\": container with ID starting with 3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2 not found: ID does not exist" containerID="3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2" Feb 23 14:53:09.254049 master-0 kubenswrapper[28758]: I0223 14:53:09.254019 28758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2"} err="failed to get container status \"3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2\": rpc error: code = NotFound desc = could not find container \"3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2\": container with ID starting with 3797ca02068c071b29a3679ca5559d7d23124b177fc0e79f3c4f8ab94dc8f7b2 not found: ID does not exist" Feb 23 14:53:09.257084 master-0 kubenswrapper[28758]: I0223 14:53:09.257055 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xjjxf"] Feb 23 14:53:09.269972 master-0 kubenswrapper[28758]: I0223 14:53:09.269910 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xjjxf"] Feb 23 14:53:09.774191 master-0 kubenswrapper[28758]: I0223 
14:53:09.774036 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-q4jwr"] Feb 23 14:53:09.786895 master-0 kubenswrapper[28758]: W0223 14:53:09.786836 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5167d84_5ffc_46bf_bb86_8d5b1f9957de.slice/crio-ca1ffdee5f0eb730c73b08a7690c31b112af512867dfd2679bd7815685e44f53 WatchSource:0}: Error finding container ca1ffdee5f0eb730c73b08a7690c31b112af512867dfd2679bd7815685e44f53: Status 404 returned error can't find the container with id ca1ffdee5f0eb730c73b08a7690c31b112af512867dfd2679bd7815685e44f53 Feb 23 14:53:10.117706 master-0 kubenswrapper[28758]: I0223 14:53:10.117537 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde92d85-6a81-45fa-a900-097b3dc3af38" path="/var/lib/kubelet/pods/dde92d85-6a81-45fa-a900-097b3dc3af38/volumes" Feb 23 14:53:10.221357 master-0 kubenswrapper[28758]: I0223 14:53:10.221303 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" event={"ID":"b5167d84-5ffc-46bf-bb86-8d5b1f9957de","Type":"ContainerStarted","Data":"c6d8526a682d62f65fa0fe8de1882f7db5439b1403898e573040e719aaed4044"} Feb 23 14:53:10.221955 master-0 kubenswrapper[28758]: I0223 14:53:10.221933 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" event={"ID":"b5167d84-5ffc-46bf-bb86-8d5b1f9957de","Type":"ContainerStarted","Data":"ca1ffdee5f0eb730c73b08a7690c31b112af512867dfd2679bd7815685e44f53"} Feb 23 14:53:10.241366 master-0 kubenswrapper[28758]: I0223 14:53:10.241240 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" podStartSLOduration=2.241216705 podStartE2EDuration="2.241216705s" podCreationTimestamp="2026-02-23 14:53:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 14:53:10.240640848 +0000 UTC m=+1122.366956790" watchObservedRunningTime="2026-02-23 14:53:10.241216705 +0000 UTC m=+1122.367532647" Feb 23 14:53:19.238328 master-0 kubenswrapper[28758]: I0223 14:53:19.238167 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:19.238328 master-0 kubenswrapper[28758]: I0223 14:53:19.238264 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:19.250937 master-0 kubenswrapper[28758]: I0223 14:53:19.250825 28758 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:53:19.337894 master-0 kubenswrapper[28758]: I0223 14:53:19.337811 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-84965d5d88-q4jwr" Feb 23 14:54:35.195851 master-0 kubenswrapper[28758]: I0223 14:54:35.195773 28758 scope.go:117] "RemoveContainer" containerID="ea8438f93e84603e9f263f66805cae02d647db14a713ffc4dbbff8d1283f37ee" Feb 23 14:55:35.274497 master-0 kubenswrapper[28758]: I0223 14:55:35.274406 28758 scope.go:117] "RemoveContainer" containerID="cca996d036b0d938c26b7289bdc7d311eca327d7be81f6a3839239c667eb9ed6" Feb 23 14:55:35.298245 master-0 kubenswrapper[28758]: I0223 14:55:35.298097 28758 scope.go:117] "RemoveContainer" containerID="0de1652540e3d677e7cc14a5d0ddea1d004b11fbddcf5d72dd3721158d988478" Feb 23 14:55:35.334638 master-0 kubenswrapper[28758]: I0223 14:55:35.334582 28758 scope.go:117] "RemoveContainer" containerID="c9fc6d29001166e45cfb2215c22cf38e0f80fb836ecd9bcc51373fcda30039cc" Feb 23 14:55:35.377104 master-0 kubenswrapper[28758]: I0223 14:55:35.377045 28758 scope.go:117] "RemoveContainer" containerID="3a7a3ce5df768907856540049b9397f5cff2fd6bc39ac7546a2c808ab9445216" Feb 
23 14:55:49.712323 master-0 kubenswrapper[28758]: I0223 14:55:49.711920 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" podUID="ada28ec1-a66b-4668-941c-c0f0cd424ee4" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.150:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:49.712323 master-0 kubenswrapper[28758]: I0223 14:55:49.711986 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-dkqkb" podUID="ada28ec1-a66b-4668-941c-c0f0cd424ee4" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.150:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:49.809643 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" podUID="5602bab5-e632-45c0-9a58-fca8c507ff8d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.145:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:49.809827 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-vd6sh" podUID="5602bab5-e632-45c0-9a58-fca8c507ff8d" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.145:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.079743 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" podUID="f257fd05-c591-4324-94b4-8f87a7741118" 
containerName="manager" probeResult="failure" output="Get \"http://10.128.0.153:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.079768 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8" podUID="97c30264-9449-47c1-8777-2af752e19ffc" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.128.0.120:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.079891 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-xnn4r" podUID="f257fd05-c591-4324-94b4-8f87a7741118" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.153:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.079921 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" podUID="13419178-05f6-4d41-be2b-2849b477ff68" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.147:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.080113 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-psqph" podUID="13419178-05f6-4d41-be2b-2849b477ff68" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.147:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.162192 28758 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf" podUID="61cee583-7aa7-483b-b0e4-96f48d26a940" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.152:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.162768 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-6888856db4-qzgk8" podUID="97c30264-9449-47c1-8777-2af752e19ffc" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.128.0.120:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.162957 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7kkjf" podUID="61cee583-7aa7-483b-b0e4-96f48d26a940" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.152:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.469085 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" podUID="0114c0ec-3af5-4d4e-adac-34f5471c64ce" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.160:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.469132 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-68f46476f-jjtxv" podUID="0114c0ec-3af5-4d4e-adac-34f5471c64ce" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.160:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.561610 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" podUID="10b5a636-69f5-4828-8e1e-9a3a598c28aa" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.162:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:50.561700 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-n9hpv" podUID="10b5a636-69f5-4828-8e1e-9a3a598c28aa" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.162:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:51.029698 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-7d66bdc8f4-chnqv" podUID="0bb85bd1-300c-4786-96ff-56978d399495" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.124:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:51.833728 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7" podUID="7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1" containerName="webhook-server" probeResult="failure" output="Get \"http://10.128.0.125:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:51.833870 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-57446bd8dd-6nwv7" podUID="7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1" 
containerName="webhook-server" probeResult="failure" output="Get \"http://10.128.0.125:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:52.316651 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" podUID="3d1bdc4f-86b7-4c36-906b-4aa5e49cd017" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.128.0.241:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:52.316735 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" podUID="3d1bdc4f-86b7-4c36-906b-4aa5e49cd017" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.128.0.241:8080/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:52.322612 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7cd7df45f8-pgvmr" podUID="3d1bdc4f-86b7-4c36-906b-4aa5e49cd017" containerName="proxy-server" probeResult="failure" output="Get \"https://10.128.0.241:8080/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.452198 master-0 kubenswrapper[28758]: I0223 14:55:52.970346 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-2d990-scheduler-0" podUID="b7266a40-897f-41e1-a8bc-0bd0c7c0f268" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.128.0.248:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 23 14:55:53.563100 master-0 kubenswrapper[28758]: I0223 14:55:53.562846 28758 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-td6ds" 
podUID="f847003c-7775-4189-896d-b6c727a97222" containerName="ovsdb-server" probeResult="failure" output="command timed out" Feb 23 14:55:53.594209 master-0 kubenswrapper[28758]: I0223 14:55:53.594067 28758 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-controller-ovs-td6ds" podUID="f847003c-7775-4189-896d-b6c727a97222" containerName="ovsdb-server" probeResult="failure" output="command timed out" Feb 23 14:55:53.613245 master-0 kubenswrapper[28758]: E0223 14:55:53.613167 28758 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="5.506s" Feb 23 14:56:35.502141 master-0 kubenswrapper[28758]: I0223 14:56:35.502092 28758 scope.go:117] "RemoveContainer" containerID="ab83798ecd0a01f893c05c912022a7b2f2a13fa225bcbd0f1c1a5b5f9983730f" Feb 23 14:56:35.526225 master-0 kubenswrapper[28758]: I0223 14:56:35.526176 28758 scope.go:117] "RemoveContainer" containerID="ce4678450a0a5b191545732e9365298462af447b51d03c19b04b940607a70ff4" Feb 23 14:58:20.946280 master-0 kubenswrapper[28758]: I0223 14:58:20.946189 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-sl8ng"] Feb 23 14:58:21.148266 master-0 kubenswrapper[28758]: I0223 14:58:21.148164 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-sl8ng"] Feb 23 14:58:22.107308 master-0 kubenswrapper[28758]: I0223 14:58:22.107240 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb37fbb-098a-4f0c-894a-918e22d2c343" path="/var/lib/kubelet/pods/8bb37fbb-098a-4f0c-894a-918e22d2c343/volumes" Feb 23 14:58:22.371045 master-0 kubenswrapper[28758]: I0223 14:58:22.370936 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8f8d-account-create-update-6sf9t"] Feb 23 14:58:22.417235 master-0 kubenswrapper[28758]: I0223 14:58:22.417164 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-8f8d-account-create-update-6sf9t"] Feb 23 14:58:22.594357 master-0 kubenswrapper[28758]: I0223 14:58:22.594283 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ec94-account-create-update-qwcst"] Feb 23 14:58:22.607018 master-0 kubenswrapper[28758]: I0223 14:58:22.606912 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b4e2-account-create-update-r6sdz"] Feb 23 14:58:22.620285 master-0 kubenswrapper[28758]: I0223 14:58:22.620234 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7xv7b"] Feb 23 14:58:22.631932 master-0 kubenswrapper[28758]: I0223 14:58:22.631830 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qwl8p"] Feb 23 14:58:22.644460 master-0 kubenswrapper[28758]: I0223 14:58:22.644402 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ec94-account-create-update-qwcst"] Feb 23 14:58:23.002209 master-0 kubenswrapper[28758]: I0223 14:58:23.002154 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b4e2-account-create-update-r6sdz"] Feb 23 14:58:23.014331 master-0 kubenswrapper[28758]: I0223 14:58:23.014229 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qwl8p"] Feb 23 14:58:23.026122 master-0 kubenswrapper[28758]: I0223 14:58:23.026026 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7xv7b"] Feb 23 14:58:24.103191 master-0 kubenswrapper[28758]: I0223 14:58:24.103128 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031f392b-0b1d-4f88-a32b-d015b41857f6" path="/var/lib/kubelet/pods/031f392b-0b1d-4f88-a32b-d015b41857f6/volumes" Feb 23 14:58:24.104028 master-0 kubenswrapper[28758]: I0223 14:58:24.103994 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03be8bbd-b280-416a-9197-9dbdd3eaf4f7" 
path="/var/lib/kubelet/pods/03be8bbd-b280-416a-9197-9dbdd3eaf4f7/volumes" Feb 23 14:58:24.104610 master-0 kubenswrapper[28758]: I0223 14:58:24.104580 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f01ce05-6e10-4db8-b887-bd8caf29c98a" path="/var/lib/kubelet/pods/0f01ce05-6e10-4db8-b887-bd8caf29c98a/volumes" Feb 23 14:58:24.105188 master-0 kubenswrapper[28758]: I0223 14:58:24.105159 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581dcbe4-b0bd-4099-8150-902c7a87c5d0" path="/var/lib/kubelet/pods/581dcbe4-b0bd-4099-8150-902c7a87c5d0/volumes" Feb 23 14:58:24.106979 master-0 kubenswrapper[28758]: I0223 14:58:24.106911 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cc23db2-205f-4165-8675-1fa946a30a84" path="/var/lib/kubelet/pods/6cc23db2-205f-4165-8675-1fa946a30a84/volumes" Feb 23 14:58:28.794516 master-0 kubenswrapper[28758]: I0223 14:58:28.792594 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jwf8n"] Feb 23 14:58:28.844786 master-0 kubenswrapper[28758]: I0223 14:58:28.844724 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jwf8n"] Feb 23 14:58:30.107562 master-0 kubenswrapper[28758]: I0223 14:58:30.107458 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="808186bd-8972-4a9d-9db7-7c3456319733" path="/var/lib/kubelet/pods/808186bd-8972-4a9d-9db7-7c3456319733/volumes" Feb 23 14:58:35.603665 master-0 kubenswrapper[28758]: I0223 14:58:35.603499 28758 scope.go:117] "RemoveContainer" containerID="2c08d53d13d7f30165ac7aedc93d8e1c641d5d13356b68e04f7c1f2da4eabe08" Feb 23 14:58:35.627337 master-0 kubenswrapper[28758]: I0223 14:58:35.627271 28758 scope.go:117] "RemoveContainer" containerID="145da82ebe449309f1e45329b59bacb66a1a189eca6a38a3f0bd22e77feea4ee" Feb 23 14:58:35.696342 master-0 kubenswrapper[28758]: I0223 14:58:35.696283 28758 scope.go:117] "RemoveContainer" 
containerID="b93b3d403c2dfa5ad4f37ec243fb3e4f8ec183bddff0e7512111eadcc6035509" Feb 23 14:58:35.755583 master-0 kubenswrapper[28758]: I0223 14:58:35.755510 28758 scope.go:117] "RemoveContainer" containerID="69379567b30488cb54059fec6b95c6d1b4534033cd6568962e1e943d7c2d18b4" Feb 23 14:58:35.845381 master-0 kubenswrapper[28758]: I0223 14:58:35.845305 28758 scope.go:117] "RemoveContainer" containerID="2472c8e571517e2668f4a6d92d50d33d4b5bfec5319bf6de6c982ddbe222435e" Feb 23 14:58:35.884745 master-0 kubenswrapper[28758]: I0223 14:58:35.884687 28758 scope.go:117] "RemoveContainer" containerID="89fff8e9c27bd5cbbd4db4f47e44b76657cb24111b81780c074ebabbc09c8ac4" Feb 23 14:58:35.948045 master-0 kubenswrapper[28758]: I0223 14:58:35.947975 28758 scope.go:117] "RemoveContainer" containerID="f95a64a7e7d566f16528df0084b6f2b4ce07211ef212bae147bff68ed53b269a" Feb 23 14:58:50.063598 master-0 kubenswrapper[28758]: I0223 14:58:50.063165 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-m256t"] Feb 23 14:58:50.081190 master-0 kubenswrapper[28758]: I0223 14:58:50.079966 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-m256t"] Feb 23 14:58:50.111518 master-0 kubenswrapper[28758]: I0223 14:58:50.111417 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f87df8a-ee8a-4398-b0ce-817da20c6349" path="/var/lib/kubelet/pods/3f87df8a-ee8a-4398-b0ce-817da20c6349/volumes" Feb 23 14:59:02.053508 master-0 kubenswrapper[28758]: I0223 14:59:02.052512 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-03f5-account-create-update-ntvsz"] Feb 23 14:59:02.064240 master-0 kubenswrapper[28758]: I0223 14:59:02.064188 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5pnls"] Feb 23 14:59:02.074176 master-0 kubenswrapper[28758]: I0223 14:59:02.074101 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-03f5-account-create-update-ntvsz"] Feb 23 14:59:02.086368 master-0 kubenswrapper[28758]: I0223 14:59:02.086255 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5pnls"] Feb 23 14:59:02.102758 master-0 kubenswrapper[28758]: I0223 14:59:02.102661 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24013480-c173-4d1f-8d00-2cffa90b04ef" path="/var/lib/kubelet/pods/24013480-c173-4d1f-8d00-2cffa90b04ef/volumes" Feb 23 14:59:02.104064 master-0 kubenswrapper[28758]: I0223 14:59:02.103779 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed2aa93-0ee8-4867-b988-9f8834149437" path="/var/lib/kubelet/pods/eed2aa93-0ee8-4867-b988-9f8834149437/volumes" Feb 23 14:59:03.051733 master-0 kubenswrapper[28758]: I0223 14:59:03.051599 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-89d6-account-create-update-wb2m8"] Feb 23 14:59:03.071512 master-0 kubenswrapper[28758]: I0223 14:59:03.063832 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xm7bc"] Feb 23 14:59:03.080004 master-0 kubenswrapper[28758]: I0223 14:59:03.079945 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-89d6-account-create-update-wb2m8"] Feb 23 14:59:03.095551 master-0 kubenswrapper[28758]: I0223 14:59:03.095453 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xm7bc"] Feb 23 14:59:04.106030 master-0 kubenswrapper[28758]: I0223 14:59:04.105947 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c3df66b-646e-4494-90d6-d17492127413" path="/var/lib/kubelet/pods/3c3df66b-646e-4494-90d6-d17492127413/volumes" Feb 23 14:59:04.107176 master-0 kubenswrapper[28758]: I0223 14:59:04.107135 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac22f15-2beb-4f7e-b34c-842e8c0fa082" 
path="/var/lib/kubelet/pods/cac22f15-2beb-4f7e-b34c-842e8c0fa082/volumes" Feb 23 14:59:13.058749 master-0 kubenswrapper[28758]: I0223 14:59:13.058679 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5k6m4"] Feb 23 14:59:13.078658 master-0 kubenswrapper[28758]: I0223 14:59:13.078528 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5k6m4"] Feb 23 14:59:14.102098 master-0 kubenswrapper[28758]: I0223 14:59:14.102031 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4d87f9-8a0a-4e73-bb46-d5b42a03a848" path="/var/lib/kubelet/pods/9b4d87f9-8a0a-4e73-bb46-d5b42a03a848/volumes" Feb 23 14:59:19.040546 master-0 kubenswrapper[28758]: I0223 14:59:19.040125 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-b22m4"] Feb 23 14:59:19.053840 master-0 kubenswrapper[28758]: I0223 14:59:19.053706 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-b22m4"] Feb 23 14:59:20.101318 master-0 kubenswrapper[28758]: I0223 14:59:20.101254 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc685c7-da7b-4080-972d-1b4bd11f68ea" path="/var/lib/kubelet/pods/dcc685c7-da7b-4080-972d-1b4bd11f68ea/volumes" Feb 23 14:59:22.048805 master-0 kubenswrapper[28758]: I0223 14:59:22.048671 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-279b-account-create-update-vbsbw"] Feb 23 14:59:22.063815 master-0 kubenswrapper[28758]: I0223 14:59:22.063710 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-279b-account-create-update-vbsbw"] Feb 23 14:59:22.123070 master-0 kubenswrapper[28758]: I0223 14:59:22.122993 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651d3a04-116a-4337-8f42-3865d8a0b9be" path="/var/lib/kubelet/pods/651d3a04-116a-4337-8f42-3865d8a0b9be/volumes" Feb 23 14:59:33.062067 master-0 kubenswrapper[28758]: I0223 
14:59:33.061914 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-w9z2m"] Feb 23 14:59:33.076850 master-0 kubenswrapper[28758]: I0223 14:59:33.076768 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-w9z2m"] Feb 23 14:59:34.106397 master-0 kubenswrapper[28758]: I0223 14:59:34.106289 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0344160e-18f1-422b-90f8-663a15320959" path="/var/lib/kubelet/pods/0344160e-18f1-422b-90f8-663a15320959/volumes" Feb 23 14:59:36.143162 master-0 kubenswrapper[28758]: I0223 14:59:36.143072 28758 scope.go:117] "RemoveContainer" containerID="b1a5bfd0e6cd5857544f9bcef717e675bca6f33a5e3a683f45c863248a2a0d06" Feb 23 14:59:36.181026 master-0 kubenswrapper[28758]: I0223 14:59:36.180839 28758 scope.go:117] "RemoveContainer" containerID="4784d3b4e6d630344a9c8a7cf42d0847dbcac07cd4d210d35398b5a75e06a2a7" Feb 23 14:59:36.231510 master-0 kubenswrapper[28758]: I0223 14:59:36.231409 28758 scope.go:117] "RemoveContainer" containerID="9c9a24046ef728b3310dcfa90c50b1866d4b91e8dd9faa870a330c0dbe7bc618" Feb 23 14:59:36.285158 master-0 kubenswrapper[28758]: I0223 14:59:36.285122 28758 scope.go:117] "RemoveContainer" containerID="b5cfdf4d7d4d0b962b36c937abac8c0ea91f4fe8099a02445b571528e1284019" Feb 23 14:59:36.348247 master-0 kubenswrapper[28758]: I0223 14:59:36.348187 28758 scope.go:117] "RemoveContainer" containerID="c330f7ca02c23b53430e906257f14262ee5d999e85979aacf84635b10c49e9f1" Feb 23 14:59:36.416975 master-0 kubenswrapper[28758]: I0223 14:59:36.416912 28758 scope.go:117] "RemoveContainer" containerID="8c40124fd4dfb812e34b683b70a2dda7334a023d4c9604b37125809b7545bb4b" Feb 23 14:59:36.459732 master-0 kubenswrapper[28758]: I0223 14:59:36.459669 28758 scope.go:117] "RemoveContainer" containerID="3cfcfafb4bed9891235ccb19553f2a35a50b794ce3fc4665a9185ee65ea8d633" Feb 23 14:59:36.486945 master-0 kubenswrapper[28758]: I0223 14:59:36.486850 28758 scope.go:117] 
"RemoveContainer" containerID="14390245c2be873fca1fe1fb4e69d4bc37faf2c1b5c8442ab2e12b381d91c0b7" Feb 23 14:59:36.522510 master-0 kubenswrapper[28758]: I0223 14:59:36.522448 28758 scope.go:117] "RemoveContainer" containerID="8862597eeca17852c4a830ebe434f16d7190bf543a8ffc6876d53ff2bcc8e132" Feb 23 14:59:39.039353 master-0 kubenswrapper[28758]: I0223 14:59:39.039291 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pn6b2"] Feb 23 14:59:39.054669 master-0 kubenswrapper[28758]: I0223 14:59:39.054600 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pn6b2"] Feb 23 14:59:40.106509 master-0 kubenswrapper[28758]: I0223 14:59:40.106420 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a94e87c-4854-4511-86a5-bb59bd265598" path="/var/lib/kubelet/pods/2a94e87c-4854-4511-86a5-bb59bd265598/volumes" Feb 23 14:59:46.225988 master-0 kubenswrapper[28758]: I0223 14:59:46.225909 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2d990-db-sync-f2ddt"] Feb 23 14:59:46.249051 master-0 kubenswrapper[28758]: I0223 14:59:46.248984 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2d990-db-sync-f2ddt"] Feb 23 14:59:48.108746 master-0 kubenswrapper[28758]: I0223 14:59:48.106775 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0378b9ac-d258-466f-8e8d-c1d27932f3b2" path="/var/lib/kubelet/pods/0378b9ac-d258-466f-8e8d-c1d27932f3b2/volumes" Feb 23 14:59:52.048232 master-0 kubenswrapper[28758]: I0223 14:59:52.048165 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-tnlrn"] Feb 23 14:59:52.063871 master-0 kubenswrapper[28758]: I0223 14:59:52.063816 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-tnlrn"] Feb 23 14:59:52.103384 master-0 kubenswrapper[28758]: I0223 14:59:52.103285 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6d0c7920-4214-40e6-83d6-46e306919ec4" path="/var/lib/kubelet/pods/6d0c7920-4214-40e6-83d6-46e306919ec4/volumes" Feb 23 14:59:59.039989 master-0 kubenswrapper[28758]: I0223 14:59:59.039914 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-szljh"] Feb 23 14:59:59.053346 master-0 kubenswrapper[28758]: I0223 14:59:59.053279 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-szljh"] Feb 23 15:00:00.104821 master-0 kubenswrapper[28758]: I0223 15:00:00.104736 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc21cc06-410c-4afe-85d4-a72d8cebf881" path="/var/lib/kubelet/pods/cc21cc06-410c-4afe-85d4-a72d8cebf881/volumes" Feb 23 15:00:00.204909 master-0 kubenswrapper[28758]: I0223 15:00:00.204835 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7"] Feb 23 15:00:00.207522 master-0 kubenswrapper[28758]: I0223 15:00:00.207437 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.210901 master-0 kubenswrapper[28758]: I0223 15:00:00.210842 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-jqqg2" Feb 23 15:00:00.211115 master-0 kubenswrapper[28758]: I0223 15:00:00.211076 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 15:00:00.218072 master-0 kubenswrapper[28758]: I0223 15:00:00.218010 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7"] Feb 23 15:00:00.326253 master-0 kubenswrapper[28758]: I0223 15:00:00.326167 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfphr\" (UniqueName: \"kubernetes.io/projected/5e1ad30e-9521-4847-93ef-e8414153c9d2-kube-api-access-hfphr\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.326538 master-0 kubenswrapper[28758]: I0223 15:00:00.326503 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e1ad30e-9521-4847-93ef-e8414153c9d2-config-volume\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.326657 master-0 kubenswrapper[28758]: I0223 15:00:00.326625 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e1ad30e-9521-4847-93ef-e8414153c9d2-secret-volume\") pod \"collect-profiles-29530980-bjwg7\" (UID: 
\"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.429052 master-0 kubenswrapper[28758]: I0223 15:00:00.428895 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e1ad30e-9521-4847-93ef-e8414153c9d2-config-volume\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.429052 master-0 kubenswrapper[28758]: I0223 15:00:00.428996 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e1ad30e-9521-4847-93ef-e8414153c9d2-secret-volume\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.429369 master-0 kubenswrapper[28758]: I0223 15:00:00.429111 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfphr\" (UniqueName: \"kubernetes.io/projected/5e1ad30e-9521-4847-93ef-e8414153c9d2-kube-api-access-hfphr\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.430760 master-0 kubenswrapper[28758]: I0223 15:00:00.430719 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e1ad30e-9521-4847-93ef-e8414153c9d2-config-volume\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.432661 master-0 kubenswrapper[28758]: I0223 15:00:00.432591 28758 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e1ad30e-9521-4847-93ef-e8414153c9d2-secret-volume\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.444856 master-0 kubenswrapper[28758]: I0223 15:00:00.444748 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfphr\" (UniqueName: \"kubernetes.io/projected/5e1ad30e-9521-4847-93ef-e8414153c9d2-kube-api-access-hfphr\") pod \"collect-profiles-29530980-bjwg7\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:00.543743 master-0 kubenswrapper[28758]: I0223 15:00:00.543667 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:01.034575 master-0 kubenswrapper[28758]: I0223 15:00:01.034436 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7"] Feb 23 15:00:01.186605 master-0 kubenswrapper[28758]: I0223 15:00:01.186395 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" event={"ID":"5e1ad30e-9521-4847-93ef-e8414153c9d2","Type":"ContainerStarted","Data":"cdbabe2a5137b054b3daefd3fffc9b0a46ff49f235f5434637dff3ccf1bb246d"} Feb 23 15:00:02.213368 master-0 kubenswrapper[28758]: I0223 15:00:02.213315 28758 generic.go:334] "Generic (PLEG): container finished" podID="5e1ad30e-9521-4847-93ef-e8414153c9d2" containerID="24f13cd6a64cf3f0acbc21eb29f6c16a359a7078ae6e4606b5c40cdd81e63065" exitCode=0 Feb 23 15:00:02.214234 master-0 kubenswrapper[28758]: I0223 15:00:02.213977 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" 
event={"ID":"5e1ad30e-9521-4847-93ef-e8414153c9d2","Type":"ContainerDied","Data":"24f13cd6a64cf3f0acbc21eb29f6c16a359a7078ae6e4606b5c40cdd81e63065"} Feb 23 15:00:03.691233 master-0 kubenswrapper[28758]: I0223 15:00:03.691179 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:03.721144 master-0 kubenswrapper[28758]: I0223 15:00:03.721079 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e1ad30e-9521-4847-93ef-e8414153c9d2-config-volume\") pod \"5e1ad30e-9521-4847-93ef-e8414153c9d2\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " Feb 23 15:00:03.721410 master-0 kubenswrapper[28758]: I0223 15:00:03.721186 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e1ad30e-9521-4847-93ef-e8414153c9d2-secret-volume\") pod \"5e1ad30e-9521-4847-93ef-e8414153c9d2\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " Feb 23 15:00:03.721410 master-0 kubenswrapper[28758]: I0223 15:00:03.721358 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfphr\" (UniqueName: \"kubernetes.io/projected/5e1ad30e-9521-4847-93ef-e8414153c9d2-kube-api-access-hfphr\") pod \"5e1ad30e-9521-4847-93ef-e8414153c9d2\" (UID: \"5e1ad30e-9521-4847-93ef-e8414153c9d2\") " Feb 23 15:00:03.721648 master-0 kubenswrapper[28758]: I0223 15:00:03.721595 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1ad30e-9521-4847-93ef-e8414153c9d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e1ad30e-9521-4847-93ef-e8414153c9d2" (UID: "5e1ad30e-9521-4847-93ef-e8414153c9d2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 15:00:03.725500 master-0 kubenswrapper[28758]: I0223 15:00:03.722641 28758 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e1ad30e-9521-4847-93ef-e8414153c9d2-config-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 15:00:03.727595 master-0 kubenswrapper[28758]: I0223 15:00:03.726265 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1ad30e-9521-4847-93ef-e8414153c9d2-kube-api-access-hfphr" (OuterVolumeSpecName: "kube-api-access-hfphr") pod "5e1ad30e-9521-4847-93ef-e8414153c9d2" (UID: "5e1ad30e-9521-4847-93ef-e8414153c9d2"). InnerVolumeSpecName "kube-api-access-hfphr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 15:00:03.727595 master-0 kubenswrapper[28758]: I0223 15:00:03.726772 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1ad30e-9521-4847-93ef-e8414153c9d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e1ad30e-9521-4847-93ef-e8414153c9d2" (UID: "5e1ad30e-9521-4847-93ef-e8414153c9d2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 15:00:03.827291 master-0 kubenswrapper[28758]: I0223 15:00:03.827037 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfphr\" (UniqueName: \"kubernetes.io/projected/5e1ad30e-9521-4847-93ef-e8414153c9d2-kube-api-access-hfphr\") on node \"master-0\" DevicePath \"\"" Feb 23 15:00:03.827291 master-0 kubenswrapper[28758]: I0223 15:00:03.827128 28758 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e1ad30e-9521-4847-93ef-e8414153c9d2-secret-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 15:00:04.239536 master-0 kubenswrapper[28758]: I0223 15:00:04.239174 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" event={"ID":"5e1ad30e-9521-4847-93ef-e8414153c9d2","Type":"ContainerDied","Data":"cdbabe2a5137b054b3daefd3fffc9b0a46ff49f235f5434637dff3ccf1bb246d"} Feb 23 15:00:04.239536 master-0 kubenswrapper[28758]: I0223 15:00:04.239220 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530980-bjwg7" Feb 23 15:00:04.239536 master-0 kubenswrapper[28758]: I0223 15:00:04.239231 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdbabe2a5137b054b3daefd3fffc9b0a46ff49f235f5434637dff3ccf1bb246d" Feb 23 15:00:07.051742 master-0 kubenswrapper[28758]: I0223 15:00:07.051662 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-w9qz6"] Feb 23 15:00:07.064642 master-0 kubenswrapper[28758]: I0223 15:00:07.064551 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-w9qz6"] Feb 23 15:00:08.103900 master-0 kubenswrapper[28758]: I0223 15:00:08.103822 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a47ca88-30b0-4569-bcd5-00994d3facc0" path="/var/lib/kubelet/pods/6a47ca88-30b0-4569-bcd5-00994d3facc0/volumes" Feb 23 15:00:09.122648 master-0 kubenswrapper[28758]: I0223 15:00:09.122519 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-149f-account-create-update-sktxn"] Feb 23 15:00:09.136311 master-0 kubenswrapper[28758]: I0223 15:00:09.136212 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-149f-account-create-update-sktxn"] Feb 23 15:00:10.102040 master-0 kubenswrapper[28758]: I0223 15:00:10.101960 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="251ce51f-9613-4b91-987b-bb29a897430f" path="/var/lib/kubelet/pods/251ce51f-9613-4b91-987b-bb29a897430f/volumes" Feb 23 15:00:29.037625 master-0 kubenswrapper[28758]: I0223 15:00:29.037396 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tggnt"] Feb 23 15:00:29.051902 master-0 kubenswrapper[28758]: I0223 15:00:29.051828 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tggnt"] Feb 23 15:00:30.103272 
master-0 kubenswrapper[28758]: I0223 15:00:30.103196 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e586dba-e451-4458-972b-223909c7901e" path="/var/lib/kubelet/pods/2e586dba-e451-4458-972b-223909c7901e/volumes" Feb 23 15:00:34.223021 master-0 kubenswrapper[28758]: I0223 15:00:34.222942 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rk2l6"] Feb 23 15:00:34.340035 master-0 kubenswrapper[28758]: I0223 15:00:34.339961 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-8kt4c"] Feb 23 15:00:34.351158 master-0 kubenswrapper[28758]: I0223 15:00:34.351075 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-844d-account-create-update-l79ql"] Feb 23 15:00:34.369774 master-0 kubenswrapper[28758]: I0223 15:00:34.369692 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-73e0-account-create-update-v8x2s"] Feb 23 15:00:34.386765 master-0 kubenswrapper[28758]: I0223 15:00:34.386687 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rk2l6"] Feb 23 15:00:34.401600 master-0 kubenswrapper[28758]: I0223 15:00:34.401543 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-8kt4c"] Feb 23 15:00:34.412865 master-0 kubenswrapper[28758]: I0223 15:00:34.412788 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8d9a-account-create-update-47dvn"] Feb 23 15:00:34.422774 master-0 kubenswrapper[28758]: I0223 15:00:34.422451 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-73e0-account-create-update-v8x2s"] Feb 23 15:00:34.432018 master-0 kubenswrapper[28758]: I0223 15:00:34.431938 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-844d-account-create-update-l79ql"] Feb 23 15:00:34.445582 master-0 kubenswrapper[28758]: I0223 15:00:34.444568 28758 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8d9a-account-create-update-47dvn"] Feb 23 15:00:35.083195 master-0 kubenswrapper[28758]: I0223 15:00:35.083041 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-p68gm"] Feb 23 15:00:35.098682 master-0 kubenswrapper[28758]: I0223 15:00:35.098600 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-p68gm"] Feb 23 15:00:36.101667 master-0 kubenswrapper[28758]: I0223 15:00:36.101607 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a17006-c08e-4ac7-8b6f-f412b4249c6c" path="/var/lib/kubelet/pods/04a17006-c08e-4ac7-8b6f-f412b4249c6c/volumes" Feb 23 15:00:36.102294 master-0 kubenswrapper[28758]: I0223 15:00:36.102273 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f37d7c-b625-42da-93ff-b6d5a1702356" path="/var/lib/kubelet/pods/48f37d7c-b625-42da-93ff-b6d5a1702356/volumes" Feb 23 15:00:36.102968 master-0 kubenswrapper[28758]: I0223 15:00:36.102935 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ad3874c-6300-4cff-80d7-b332ebc88e5d" path="/var/lib/kubelet/pods/4ad3874c-6300-4cff-80d7-b332ebc88e5d/volumes" Feb 23 15:00:36.103596 master-0 kubenswrapper[28758]: I0223 15:00:36.103567 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708dbd36-4a71-40d9-8de7-827de4921a8a" path="/var/lib/kubelet/pods/708dbd36-4a71-40d9-8de7-827de4921a8a/volumes" Feb 23 15:00:36.104748 master-0 kubenswrapper[28758]: I0223 15:00:36.104715 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bc143e-fd3c-4a62-a069-5e2357cb2209" path="/var/lib/kubelet/pods/98bc143e-fd3c-4a62-a069-5e2357cb2209/volumes" Feb 23 15:00:36.105611 master-0 kubenswrapper[28758]: I0223 15:00:36.105589 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2f2ca95-2590-4599-8b0f-4714ed65c1b4" 
path="/var/lib/kubelet/pods/f2f2ca95-2590-4599-8b0f-4714ed65c1b4/volumes" Feb 23 15:00:36.742020 master-0 kubenswrapper[28758]: I0223 15:00:36.741918 28758 scope.go:117] "RemoveContainer" containerID="43ebae455950d08989e69efc2e780b795372f8bc26c6b93ffee2179991f345a2" Feb 23 15:00:36.774276 master-0 kubenswrapper[28758]: I0223 15:00:36.774236 28758 scope.go:117] "RemoveContainer" containerID="0839226bf47baff2ee4462e5c3235acac5b23f2c1f08a5b5f9b45464eed9fd45" Feb 23 15:00:36.828238 master-0 kubenswrapper[28758]: I0223 15:00:36.828195 28758 scope.go:117] "RemoveContainer" containerID="2897b78f65d741597b65d80d7ce9ae6c67ba520a4aa7a6776404a9ba4cebb879" Feb 23 15:00:36.874151 master-0 kubenswrapper[28758]: I0223 15:00:36.874068 28758 scope.go:117] "RemoveContainer" containerID="f3fe2183d4c3c96c027d08265e909347e2bfe51dfe60c7bde4cf7f8db1acc464" Feb 23 15:00:36.936246 master-0 kubenswrapper[28758]: I0223 15:00:36.936196 28758 scope.go:117] "RemoveContainer" containerID="db726734716cb9c758f04cf1225d4bf86e0a58d90cc3c2898c49f77f4d8f0dba" Feb 23 15:00:37.006528 master-0 kubenswrapper[28758]: I0223 15:00:37.006419 28758 scope.go:117] "RemoveContainer" containerID="abcc5bdf5d9d30d513b449d5db08480f5e24894fb2a696a2e1ce51134aa83860" Feb 23 15:00:37.031696 master-0 kubenswrapper[28758]: I0223 15:00:37.031641 28758 scope.go:117] "RemoveContainer" containerID="d0727e4a4a021693796beed67c9d1e16bd7e64167c5f7f6fb334424cfa61f51a" Feb 23 15:00:37.056623 master-0 kubenswrapper[28758]: I0223 15:00:37.056570 28758 scope.go:117] "RemoveContainer" containerID="8b53e0ca29c760d86a78f00fa7f9b5319f8023c9a30f5cbcf5bcf9156478c801" Feb 23 15:00:37.091163 master-0 kubenswrapper[28758]: I0223 15:00:37.091110 28758 scope.go:117] "RemoveContainer" containerID="bc3660a666ef358c4127c7c478004d9485ab4800c068cb7d437da4c2d89f18b3" Feb 23 15:00:37.133076 master-0 kubenswrapper[28758]: I0223 15:00:37.133015 28758 scope.go:117] "RemoveContainer" 
containerID="07f8bb525d47f0913ba3f4e74c1e76a4eb4a1c1bed843ddb4abcdc854b95ce0a" Feb 23 15:00:37.175853 master-0 kubenswrapper[28758]: I0223 15:00:37.175761 28758 scope.go:117] "RemoveContainer" containerID="92c8c5e5998cae5570d1aa787109bc8e92bb5d07c7a2f67c6912e04e15f6011d" Feb 23 15:00:37.201027 master-0 kubenswrapper[28758]: I0223 15:00:37.200985 28758 scope.go:117] "RemoveContainer" containerID="f1899198a88f763740bd890a3b93d1aa145dbe8e2027b332ad777ba096752dec" Feb 23 15:00:37.233598 master-0 kubenswrapper[28758]: I0223 15:00:37.233538 28758 scope.go:117] "RemoveContainer" containerID="81b5a93cbe911727c15ff561e49dbfea918939c3fadde01e9b97707a0530b51d" Feb 23 15:00:37.262507 master-0 kubenswrapper[28758]: I0223 15:00:37.262434 28758 scope.go:117] "RemoveContainer" containerID="864f3c07d114d4242476f8231283727d6e18c11c580c1e6f8dddb6379ad9dea6" Feb 23 15:01:00.162941 master-0 kubenswrapper[28758]: I0223 15:01:00.162785 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29530981-kbw2b"] Feb 23 15:01:00.163591 master-0 kubenswrapper[28758]: E0223 15:01:00.163517 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1ad30e-9521-4847-93ef-e8414153c9d2" containerName="collect-profiles" Feb 23 15:01:00.163591 master-0 kubenswrapper[28758]: I0223 15:01:00.163542 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1ad30e-9521-4847-93ef-e8414153c9d2" containerName="collect-profiles" Feb 23 15:01:00.163905 master-0 kubenswrapper[28758]: I0223 15:01:00.163872 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1ad30e-9521-4847-93ef-e8414153c9d2" containerName="collect-profiles" Feb 23 15:01:00.165823 master-0 kubenswrapper[28758]: I0223 15:01:00.165553 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.178022 master-0 kubenswrapper[28758]: I0223 15:01:00.177965 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530981-kbw2b"] Feb 23 15:01:00.266743 master-0 kubenswrapper[28758]: I0223 15:01:00.266633 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-fernet-keys\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.267046 master-0 kubenswrapper[28758]: I0223 15:01:00.266952 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-combined-ca-bundle\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.267313 master-0 kubenswrapper[28758]: I0223 15:01:00.267285 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-config-data\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.267459 master-0 kubenswrapper[28758]: I0223 15:01:00.267384 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jfj\" (UniqueName: \"kubernetes.io/projected/6ddbd438-9416-4af2-95cd-ac31b5a419b3-kube-api-access-k5jfj\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.370332 master-0 kubenswrapper[28758]: I0223 
15:01:00.370253 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-fernet-keys\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.370624 master-0 kubenswrapper[28758]: I0223 15:01:00.370403 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-combined-ca-bundle\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.371288 master-0 kubenswrapper[28758]: I0223 15:01:00.371245 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-config-data\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.371350 master-0 kubenswrapper[28758]: I0223 15:01:00.371326 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jfj\" (UniqueName: \"kubernetes.io/projected/6ddbd438-9416-4af2-95cd-ac31b5a419b3-kube-api-access-k5jfj\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.374614 master-0 kubenswrapper[28758]: I0223 15:01:00.374569 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-combined-ca-bundle\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.374791 master-0 
kubenswrapper[28758]: I0223 15:01:00.374756 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-config-data\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.377347 master-0 kubenswrapper[28758]: I0223 15:01:00.377298 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-fernet-keys\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.402993 master-0 kubenswrapper[28758]: I0223 15:01:00.402896 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5jfj\" (UniqueName: \"kubernetes.io/projected/6ddbd438-9416-4af2-95cd-ac31b5a419b3-kube-api-access-k5jfj\") pod \"keystone-cron-29530981-kbw2b\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:00.486123 master-0 kubenswrapper[28758]: I0223 15:01:00.486035 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:01.013649 master-0 kubenswrapper[28758]: I0223 15:01:01.013303 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29530981-kbw2b"] Feb 23 15:01:01.891070 master-0 kubenswrapper[28758]: I0223 15:01:01.890999 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530981-kbw2b" event={"ID":"6ddbd438-9416-4af2-95cd-ac31b5a419b3","Type":"ContainerStarted","Data":"88dc29544c37d9036269a1ca012de04d37cb09d37f95a7ad022d1690c0b73f10"} Feb 23 15:01:01.891070 master-0 kubenswrapper[28758]: I0223 15:01:01.891056 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530981-kbw2b" event={"ID":"6ddbd438-9416-4af2-95cd-ac31b5a419b3","Type":"ContainerStarted","Data":"0148d5740969c6aada939ef9e6e9caf4e3f905077785e1c3ff68918f4905ad26"} Feb 23 15:01:03.912693 master-0 kubenswrapper[28758]: I0223 15:01:03.912510 28758 generic.go:334] "Generic (PLEG): container finished" podID="6ddbd438-9416-4af2-95cd-ac31b5a419b3" containerID="88dc29544c37d9036269a1ca012de04d37cb09d37f95a7ad022d1690c0b73f10" exitCode=0 Feb 23 15:01:03.912693 master-0 kubenswrapper[28758]: I0223 15:01:03.912570 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530981-kbw2b" event={"ID":"6ddbd438-9416-4af2-95cd-ac31b5a419b3","Type":"ContainerDied","Data":"88dc29544c37d9036269a1ca012de04d37cb09d37f95a7ad022d1690c0b73f10"} Feb 23 15:01:05.342420 master-0 kubenswrapper[28758]: I0223 15:01:05.342372 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:05.418507 master-0 kubenswrapper[28758]: I0223 15:01:05.418413 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5jfj\" (UniqueName: \"kubernetes.io/projected/6ddbd438-9416-4af2-95cd-ac31b5a419b3-kube-api-access-k5jfj\") pod \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " Feb 23 15:01:05.418822 master-0 kubenswrapper[28758]: I0223 15:01:05.418591 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-combined-ca-bundle\") pod \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " Feb 23 15:01:05.419810 master-0 kubenswrapper[28758]: I0223 15:01:05.419782 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-config-data\") pod \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " Feb 23 15:01:05.420063 master-0 kubenswrapper[28758]: I0223 15:01:05.419834 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-fernet-keys\") pod \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\" (UID: \"6ddbd438-9416-4af2-95cd-ac31b5a419b3\") " Feb 23 15:01:05.422525 master-0 kubenswrapper[28758]: I0223 15:01:05.422432 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ddbd438-9416-4af2-95cd-ac31b5a419b3-kube-api-access-k5jfj" (OuterVolumeSpecName: "kube-api-access-k5jfj") pod "6ddbd438-9416-4af2-95cd-ac31b5a419b3" (UID: "6ddbd438-9416-4af2-95cd-ac31b5a419b3"). InnerVolumeSpecName "kube-api-access-k5jfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 15:01:05.423084 master-0 kubenswrapper[28758]: I0223 15:01:05.423061 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6ddbd438-9416-4af2-95cd-ac31b5a419b3" (UID: "6ddbd438-9416-4af2-95cd-ac31b5a419b3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 15:01:05.449696 master-0 kubenswrapper[28758]: I0223 15:01:05.449600 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ddbd438-9416-4af2-95cd-ac31b5a419b3" (UID: "6ddbd438-9416-4af2-95cd-ac31b5a419b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 15:01:05.474734 master-0 kubenswrapper[28758]: I0223 15:01:05.474687 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-config-data" (OuterVolumeSpecName: "config-data") pod "6ddbd438-9416-4af2-95cd-ac31b5a419b3" (UID: "6ddbd438-9416-4af2-95cd-ac31b5a419b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 15:01:05.524080 master-0 kubenswrapper[28758]: I0223 15:01:05.524007 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5jfj\" (UniqueName: \"kubernetes.io/projected/6ddbd438-9416-4af2-95cd-ac31b5a419b3-kube-api-access-k5jfj\") on node \"master-0\" DevicePath \"\"" Feb 23 15:01:05.524080 master-0 kubenswrapper[28758]: I0223 15:01:05.524058 28758 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Feb 23 15:01:05.524080 master-0 kubenswrapper[28758]: I0223 15:01:05.524067 28758 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-config-data\") on node \"master-0\" DevicePath \"\"" Feb 23 15:01:05.524080 master-0 kubenswrapper[28758]: I0223 15:01:05.524077 28758 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ddbd438-9416-4af2-95cd-ac31b5a419b3-fernet-keys\") on node \"master-0\" DevicePath \"\"" Feb 23 15:01:05.946877 master-0 kubenswrapper[28758]: I0223 15:01:05.946775 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29530981-kbw2b" event={"ID":"6ddbd438-9416-4af2-95cd-ac31b5a419b3","Type":"ContainerDied","Data":"0148d5740969c6aada939ef9e6e9caf4e3f905077785e1c3ff68918f4905ad26"} Feb 23 15:01:05.946877 master-0 kubenswrapper[28758]: I0223 15:01:05.946873 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0148d5740969c6aada939ef9e6e9caf4e3f905077785e1c3ff68918f4905ad26" Feb 23 15:01:05.947138 master-0 kubenswrapper[28758]: I0223 15:01:05.946878 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29530981-kbw2b" Feb 23 15:01:11.069497 master-0 kubenswrapper[28758]: I0223 15:01:11.069411 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-454z9"] Feb 23 15:01:11.079352 master-0 kubenswrapper[28758]: I0223 15:01:11.079284 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-454z9"] Feb 23 15:01:12.103641 master-0 kubenswrapper[28758]: I0223 15:01:12.103578 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76227c43-1d77-4dd0-93fd-90a100ccb01e" path="/var/lib/kubelet/pods/76227c43-1d77-4dd0-93fd-90a100ccb01e/volumes" Feb 23 15:01:35.049665 master-0 kubenswrapper[28758]: I0223 15:01:35.049564 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-cbtv5"] Feb 23 15:01:35.061216 master-0 kubenswrapper[28758]: I0223 15:01:35.061143 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-cbtv5"] Feb 23 15:01:36.101889 master-0 kubenswrapper[28758]: I0223 15:01:36.101831 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0c2335-dbed-4733-997a-cb1ab862acb8" path="/var/lib/kubelet/pods/5b0c2335-dbed-4733-997a-cb1ab862acb8/volumes" Feb 23 15:01:37.547997 master-0 kubenswrapper[28758]: I0223 15:01:37.547931 28758 scope.go:117] "RemoveContainer" containerID="3656f1f4e8318bcc4bcd30498a1cd13ed4f65f408c0527816b585587aefa1cae" Feb 23 15:01:37.604103 master-0 kubenswrapper[28758]: I0223 15:01:37.604049 28758 scope.go:117] "RemoveContainer" containerID="8a63eb55160a29c3331d218879d84fd9fc1563ed7864fbf547dd3a2599195d2b" Feb 23 15:01:38.041073 master-0 kubenswrapper[28758]: I0223 15:01:38.040944 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mj69t"] Feb 23 15:01:38.054926 master-0 kubenswrapper[28758]: I0223 15:01:38.054822 28758 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mj69t"] Feb 23 15:01:38.109137 master-0 kubenswrapper[28758]: I0223 15:01:38.108163 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4acaac-3f2b-41c9-a0fc-76232abf5168" path="/var/lib/kubelet/pods/bb4acaac-3f2b-41c9-a0fc-76232abf5168/volumes" Feb 23 15:02:13.269334 master-0 kubenswrapper[28758]: I0223 15:02:13.269249 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-9m76c"] Feb 23 15:02:13.429241 master-0 kubenswrapper[28758]: I0223 15:02:13.429154 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-9m76c"] Feb 23 15:02:14.109196 master-0 kubenswrapper[28758]: I0223 15:02:14.109111 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64cc097b-5a40-40dc-961d-2951e576f39e" path="/var/lib/kubelet/pods/64cc097b-5a40-40dc-961d-2951e576f39e/volumes" Feb 23 15:02:15.056965 master-0 kubenswrapper[28758]: I0223 15:02:15.056906 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-btgpd"] Feb 23 15:02:15.071878 master-0 kubenswrapper[28758]: I0223 15:02:15.071805 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-btgpd"] Feb 23 15:02:16.103565 master-0 kubenswrapper[28758]: I0223 15:02:16.102808 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ec10e8-80ea-4fc3-93ce-3deec7202918" path="/var/lib/kubelet/pods/c2ec10e8-80ea-4fc3-93ce-3deec7202918/volumes" Feb 23 15:02:37.690157 master-0 kubenswrapper[28758]: I0223 15:02:37.690083 28758 scope.go:117] "RemoveContainer" containerID="7721aae09aafaa7e916c995bc3522bc0e1eb74299fe988fc4bfc2950d24c1c5a" Feb 23 15:02:37.736504 master-0 kubenswrapper[28758]: I0223 15:02:37.736431 28758 scope.go:117] "RemoveContainer" containerID="53bfde90eeec423d49fc398e4df02d1b4e0852a76e87fe0b10c460c7b653dfec" Feb 23 
15:02:37.787646 master-0 kubenswrapper[28758]: I0223 15:02:37.787588 28758 scope.go:117] "RemoveContainer" containerID="1c658e32c2067f75610049b7ce969bb955f138c4bfd9451049314dae8f697706" Feb 23 15:14:12.149431 master-0 kubenswrapper[28758]: E0223 15:14:12.149352 28758 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:46210->192.168.32.10:43975: write tcp 192.168.32.10:46210->192.168.32.10:43975: write: connection reset by peer Feb 23 15:15:00.160368 master-0 kubenswrapper[28758]: I0223 15:15:00.160287 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p"] Feb 23 15:15:00.160998 master-0 kubenswrapper[28758]: E0223 15:15:00.160916 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ddbd438-9416-4af2-95cd-ac31b5a419b3" containerName="keystone-cron" Feb 23 15:15:00.160998 master-0 kubenswrapper[28758]: I0223 15:15:00.160936 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ddbd438-9416-4af2-95cd-ac31b5a419b3" containerName="keystone-cron" Feb 23 15:15:00.161396 master-0 kubenswrapper[28758]: I0223 15:15:00.161366 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ddbd438-9416-4af2-95cd-ac31b5a419b3" containerName="keystone-cron" Feb 23 15:15:00.162570 master-0 kubenswrapper[28758]: I0223 15:15:00.162543 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.164710 master-0 kubenswrapper[28758]: I0223 15:15:00.164653 28758 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-jqqg2" Feb 23 15:15:00.164845 master-0 kubenswrapper[28758]: I0223 15:15:00.164728 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 15:15:00.177204 master-0 kubenswrapper[28758]: I0223 15:15:00.176769 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p"] Feb 23 15:15:00.315562 master-0 kubenswrapper[28758]: I0223 15:15:00.315466 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fcb67bad-fb11-42ae-a0ea-9383a1011a93-config-volume\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.315821 master-0 kubenswrapper[28758]: I0223 15:15:00.315799 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fcb67bad-fb11-42ae-a0ea-9383a1011a93-secret-volume\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.315865 master-0 kubenswrapper[28758]: I0223 15:15:00.315849 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st5s8\" (UniqueName: \"kubernetes.io/projected/fcb67bad-fb11-42ae-a0ea-9383a1011a93-kube-api-access-st5s8\") pod \"collect-profiles-29530995-5zb8p\" (UID: 
\"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.417785 master-0 kubenswrapper[28758]: I0223 15:15:00.417653 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fcb67bad-fb11-42ae-a0ea-9383a1011a93-config-volume\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.417998 master-0 kubenswrapper[28758]: I0223 15:15:00.417879 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fcb67bad-fb11-42ae-a0ea-9383a1011a93-secret-volume\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.417998 master-0 kubenswrapper[28758]: I0223 15:15:00.417924 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st5s8\" (UniqueName: \"kubernetes.io/projected/fcb67bad-fb11-42ae-a0ea-9383a1011a93-kube-api-access-st5s8\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.418741 master-0 kubenswrapper[28758]: I0223 15:15:00.418696 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fcb67bad-fb11-42ae-a0ea-9383a1011a93-config-volume\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.421371 master-0 kubenswrapper[28758]: I0223 15:15:00.421328 28758 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fcb67bad-fb11-42ae-a0ea-9383a1011a93-secret-volume\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.439324 master-0 kubenswrapper[28758]: I0223 15:15:00.439271 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st5s8\" (UniqueName: \"kubernetes.io/projected/fcb67bad-fb11-42ae-a0ea-9383a1011a93-kube-api-access-st5s8\") pod \"collect-profiles-29530995-5zb8p\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:00.499808 master-0 kubenswrapper[28758]: I0223 15:15:00.499710 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:01.147052 master-0 kubenswrapper[28758]: W0223 15:15:01.143312 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb67bad_fb11_42ae_a0ea_9383a1011a93.slice/crio-94a6243730cbd30c8dabf40773ac3dc55b76fb10f201b57c81f385925a0be101 WatchSource:0}: Error finding container 94a6243730cbd30c8dabf40773ac3dc55b76fb10f201b57c81f385925a0be101: Status 404 returned error can't find the container with id 94a6243730cbd30c8dabf40773ac3dc55b76fb10f201b57c81f385925a0be101 Feb 23 15:15:01.147808 master-0 kubenswrapper[28758]: I0223 15:15:01.147262 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p"] Feb 23 15:15:01.254775 master-0 kubenswrapper[28758]: I0223 15:15:01.254725 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" 
event={"ID":"fcb67bad-fb11-42ae-a0ea-9383a1011a93","Type":"ContainerStarted","Data":"94a6243730cbd30c8dabf40773ac3dc55b76fb10f201b57c81f385925a0be101"} Feb 23 15:15:02.267780 master-0 kubenswrapper[28758]: I0223 15:15:02.267668 28758 generic.go:334] "Generic (PLEG): container finished" podID="fcb67bad-fb11-42ae-a0ea-9383a1011a93" containerID="d8c8307d03d94f8deeff463834a20978f1601120135e2a96fa6bef0bfa7511a4" exitCode=0 Feb 23 15:15:02.267780 master-0 kubenswrapper[28758]: I0223 15:15:02.267735 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" event={"ID":"fcb67bad-fb11-42ae-a0ea-9383a1011a93","Type":"ContainerDied","Data":"d8c8307d03d94f8deeff463834a20978f1601120135e2a96fa6bef0bfa7511a4"} Feb 23 15:15:03.733520 master-0 kubenswrapper[28758]: I0223 15:15:03.733405 28758 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:03.860405 master-0 kubenswrapper[28758]: I0223 15:15:03.860235 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fcb67bad-fb11-42ae-a0ea-9383a1011a93-config-volume\") pod \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " Feb 23 15:15:03.860405 master-0 kubenswrapper[28758]: I0223 15:15:03.860361 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fcb67bad-fb11-42ae-a0ea-9383a1011a93-secret-volume\") pod \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " Feb 23 15:15:03.860405 master-0 kubenswrapper[28758]: I0223 15:15:03.860395 28758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st5s8\" (UniqueName: 
\"kubernetes.io/projected/fcb67bad-fb11-42ae-a0ea-9383a1011a93-kube-api-access-st5s8\") pod \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\" (UID: \"fcb67bad-fb11-42ae-a0ea-9383a1011a93\") " Feb 23 15:15:03.860886 master-0 kubenswrapper[28758]: I0223 15:15:03.860799 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcb67bad-fb11-42ae-a0ea-9383a1011a93-config-volume" (OuterVolumeSpecName: "config-volume") pod "fcb67bad-fb11-42ae-a0ea-9383a1011a93" (UID: "fcb67bad-fb11-42ae-a0ea-9383a1011a93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 15:15:03.861808 master-0 kubenswrapper[28758]: I0223 15:15:03.861774 28758 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fcb67bad-fb11-42ae-a0ea-9383a1011a93-config-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 15:15:03.863680 master-0 kubenswrapper[28758]: I0223 15:15:03.863641 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb67bad-fb11-42ae-a0ea-9383a1011a93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fcb67bad-fb11-42ae-a0ea-9383a1011a93" (UID: "fcb67bad-fb11-42ae-a0ea-9383a1011a93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 15:15:03.863739 master-0 kubenswrapper[28758]: I0223 15:15:03.863711 28758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb67bad-fb11-42ae-a0ea-9383a1011a93-kube-api-access-st5s8" (OuterVolumeSpecName: "kube-api-access-st5s8") pod "fcb67bad-fb11-42ae-a0ea-9383a1011a93" (UID: "fcb67bad-fb11-42ae-a0ea-9383a1011a93"). InnerVolumeSpecName "kube-api-access-st5s8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 15:15:03.963983 master-0 kubenswrapper[28758]: I0223 15:15:03.963906 28758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st5s8\" (UniqueName: \"kubernetes.io/projected/fcb67bad-fb11-42ae-a0ea-9383a1011a93-kube-api-access-st5s8\") on node \"master-0\" DevicePath \"\"" Feb 23 15:15:03.963983 master-0 kubenswrapper[28758]: I0223 15:15:03.963952 28758 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fcb67bad-fb11-42ae-a0ea-9383a1011a93-secret-volume\") on node \"master-0\" DevicePath \"\"" Feb 23 15:15:04.293186 master-0 kubenswrapper[28758]: I0223 15:15:04.293110 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" event={"ID":"fcb67bad-fb11-42ae-a0ea-9383a1011a93","Type":"ContainerDied","Data":"94a6243730cbd30c8dabf40773ac3dc55b76fb10f201b57c81f385925a0be101"} Feb 23 15:15:04.293186 master-0 kubenswrapper[28758]: I0223 15:15:04.293183 28758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94a6243730cbd30c8dabf40773ac3dc55b76fb10f201b57c81f385925a0be101" Feb 23 15:15:04.293186 master-0 kubenswrapper[28758]: I0223 15:15:04.293192 28758 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530995-5zb8p" Feb 23 15:15:04.876113 master-0 kubenswrapper[28758]: I0223 15:15:04.875973 28758 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm"] Feb 23 15:15:04.885906 master-0 kubenswrapper[28758]: I0223 15:15:04.885823 28758 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530950-wnkgm"] Feb 23 15:15:06.102357 master-0 kubenswrapper[28758]: I0223 15:15:06.102150 28758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f5dea4-ed09-44a1-8eb1-d1fc497cc173" path="/var/lib/kubelet/pods/78f5dea4-ed09-44a1-8eb1-d1fc497cc173/volumes" Feb 23 15:15:38.261830 master-0 kubenswrapper[28758]: I0223 15:15:38.261761 28758 scope.go:117] "RemoveContainer" containerID="27840ca7db3cacb7b24041918e945eaa29f553e36d936e622a640f67b21753c5" Feb 23 15:27:07.550326 master-0 kubenswrapper[28758]: I0223 15:27:07.550265 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98rpd/must-gather-jmktf"] Feb 23 15:27:07.551645 master-0 kubenswrapper[28758]: E0223 15:27:07.551599 28758 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb67bad-fb11-42ae-a0ea-9383a1011a93" containerName="collect-profiles" Feb 23 15:27:07.551712 master-0 kubenswrapper[28758]: I0223 15:27:07.551646 28758 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb67bad-fb11-42ae-a0ea-9383a1011a93" containerName="collect-profiles" Feb 23 15:27:07.552066 master-0 kubenswrapper[28758]: I0223 15:27:07.552024 28758 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb67bad-fb11-42ae-a0ea-9383a1011a93" containerName="collect-profiles" Feb 23 15:27:07.553886 master-0 kubenswrapper[28758]: I0223 15:27:07.553853 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.556073 master-0 kubenswrapper[28758]: I0223 15:27:07.556011 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-98rpd"/"openshift-service-ca.crt" Feb 23 15:27:07.558209 master-0 kubenswrapper[28758]: I0223 15:27:07.558161 28758 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-98rpd"/"kube-root-ca.crt" Feb 23 15:27:07.564781 master-0 kubenswrapper[28758]: I0223 15:27:07.563663 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98rpd/must-gather-wts7j"] Feb 23 15:27:07.565783 master-0 kubenswrapper[28758]: I0223 15:27:07.565748 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:07.598258 master-0 kubenswrapper[28758]: I0223 15:27:07.598180 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98rpd/must-gather-wts7j"] Feb 23 15:27:07.673722 master-0 kubenswrapper[28758]: I0223 15:27:07.673648 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98rpd/must-gather-jmktf"] Feb 23 15:27:07.727422 master-0 kubenswrapper[28758]: I0223 15:27:07.727343 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9cmp\" (UniqueName: \"kubernetes.io/projected/09d8a16c-fdc6-4086-978e-45952af9c3af-kube-api-access-j9cmp\") pod \"must-gather-jmktf\" (UID: \"09d8a16c-fdc6-4086-978e-45952af9c3af\") " pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.727677 master-0 kubenswrapper[28758]: I0223 15:27:07.727455 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xk8\" (UniqueName: \"kubernetes.io/projected/1a6304fa-ddf1-4a08-9a32-710e70ab5bdb-kube-api-access-l6xk8\") pod \"must-gather-wts7j\" 
(UID: \"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb\") " pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:07.727677 master-0 kubenswrapper[28758]: I0223 15:27:07.727613 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a6304fa-ddf1-4a08-9a32-710e70ab5bdb-must-gather-output\") pod \"must-gather-wts7j\" (UID: \"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb\") " pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:07.727750 master-0 kubenswrapper[28758]: I0223 15:27:07.727704 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d8a16c-fdc6-4086-978e-45952af9c3af-must-gather-output\") pod \"must-gather-jmktf\" (UID: \"09d8a16c-fdc6-4086-978e-45952af9c3af\") " pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.830409 master-0 kubenswrapper[28758]: I0223 15:27:07.830269 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9cmp\" (UniqueName: \"kubernetes.io/projected/09d8a16c-fdc6-4086-978e-45952af9c3af-kube-api-access-j9cmp\") pod \"must-gather-jmktf\" (UID: \"09d8a16c-fdc6-4086-978e-45952af9c3af\") " pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.830409 master-0 kubenswrapper[28758]: I0223 15:27:07.830387 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xk8\" (UniqueName: \"kubernetes.io/projected/1a6304fa-ddf1-4a08-9a32-710e70ab5bdb-kube-api-access-l6xk8\") pod \"must-gather-wts7j\" (UID: \"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb\") " pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:07.830677 master-0 kubenswrapper[28758]: I0223 15:27:07.830589 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/1a6304fa-ddf1-4a08-9a32-710e70ab5bdb-must-gather-output\") pod \"must-gather-wts7j\" (UID: \"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb\") " pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:07.830713 master-0 kubenswrapper[28758]: I0223 15:27:07.830684 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d8a16c-fdc6-4086-978e-45952af9c3af-must-gather-output\") pod \"must-gather-jmktf\" (UID: \"09d8a16c-fdc6-4086-978e-45952af9c3af\") " pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.831090 master-0 kubenswrapper[28758]: I0223 15:27:07.831060 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a6304fa-ddf1-4a08-9a32-710e70ab5bdb-must-gather-output\") pod \"must-gather-wts7j\" (UID: \"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb\") " pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:07.831154 master-0 kubenswrapper[28758]: I0223 15:27:07.831098 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d8a16c-fdc6-4086-978e-45952af9c3af-must-gather-output\") pod \"must-gather-jmktf\" (UID: \"09d8a16c-fdc6-4086-978e-45952af9c3af\") " pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.857340 master-0 kubenswrapper[28758]: I0223 15:27:07.857300 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xk8\" (UniqueName: \"kubernetes.io/projected/1a6304fa-ddf1-4a08-9a32-710e70ab5bdb-kube-api-access-l6xk8\") pod \"must-gather-wts7j\" (UID: \"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb\") " pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:07.857616 master-0 kubenswrapper[28758]: I0223 15:27:07.857412 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9cmp\" 
(UniqueName: \"kubernetes.io/projected/09d8a16c-fdc6-4086-978e-45952af9c3af-kube-api-access-j9cmp\") pod \"must-gather-jmktf\" (UID: \"09d8a16c-fdc6-4086-978e-45952af9c3af\") " pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.893462 master-0 kubenswrapper[28758]: I0223 15:27:07.893398 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98rpd/must-gather-jmktf" Feb 23 15:27:07.908532 master-0 kubenswrapper[28758]: I0223 15:27:07.908452 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98rpd/must-gather-wts7j" Feb 23 15:27:08.397017 master-0 kubenswrapper[28758]: I0223 15:27:08.396953 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98rpd/must-gather-jmktf"] Feb 23 15:27:08.400349 master-0 kubenswrapper[28758]: I0223 15:27:08.400313 28758 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 15:27:08.493126 master-0 kubenswrapper[28758]: I0223 15:27:08.493047 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98rpd/must-gather-wts7j"] Feb 23 15:27:08.882495 master-0 kubenswrapper[28758]: I0223 15:27:08.882413 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/must-gather-jmktf" event={"ID":"09d8a16c-fdc6-4086-978e-45952af9c3af","Type":"ContainerStarted","Data":"cde7aeb74a61a6e4b1b87e4b0108d2b842dc079db2c3043e6943c3817f1bf50c"} Feb 23 15:27:08.884010 master-0 kubenswrapper[28758]: I0223 15:27:08.883972 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/must-gather-wts7j" event={"ID":"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb","Type":"ContainerStarted","Data":"9b0401aae270b3781d10d9cdad1818cdce6e7e6870e614239e0b909d5173b4d3"} Feb 23 15:27:10.834499 master-0 kubenswrapper[28758]: I0223 15:27:10.834413 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-version_cluster-version-operator-57476485-m58rm_b9774f8c-0f29-46d8-be77-81bcf74d5994/cluster-version-operator/0.log" Feb 23 15:27:10.911265 master-0 kubenswrapper[28758]: I0223 15:27:10.911193 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/must-gather-wts7j" event={"ID":"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb","Type":"ContainerStarted","Data":"d3ebe196dcf89d9d681f945b9e7f737ec2acf6afed4167a1abc7ce049d4c2d0d"} Feb 23 15:27:10.911265 master-0 kubenswrapper[28758]: I0223 15:27:10.911268 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/must-gather-wts7j" event={"ID":"1a6304fa-ddf1-4a08-9a32-710e70ab5bdb","Type":"ContainerStarted","Data":"cf67a1cc904b3dbb71468181133f17cabc7a6917cb3c8d4c13284ee8153080c0"} Feb 23 15:27:10.965738 master-0 kubenswrapper[28758]: I0223 15:27:10.965655 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98rpd/must-gather-wts7j" podStartSLOduration=2.829413615 podStartE2EDuration="3.965634481s" podCreationTimestamp="2026-02-23 15:27:07 +0000 UTC" firstStartedPulling="2026-02-23 15:27:08.496907894 +0000 UTC m=+3160.623223826" lastFinishedPulling="2026-02-23 15:27:09.63312876 +0000 UTC m=+3161.759444692" observedRunningTime="2026-02-23 15:27:10.955325603 +0000 UTC m=+3163.081641545" watchObservedRunningTime="2026-02-23 15:27:10.965634481 +0000 UTC m=+3163.091950423" Feb 23 15:27:12.353792 master-0 kubenswrapper[28758]: I0223 15:27:12.352100 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-57476485-m58rm_b9774f8c-0f29-46d8-be77-81bcf74d5994/cluster-version-operator/1.log" Feb 23 15:27:15.914927 master-0 kubenswrapper[28758]: I0223 15:27:15.912670 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-ng24s_efa55f6d-66f7-480c-9d2e-465c0822c7a8/nmstate-console-plugin/0.log" Feb 23 15:27:15.956911 master-0 kubenswrapper[28758]: I0223 15:27:15.956854 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zs2k7_58d61652-add1-403d-95f7-e27d89f02376/nmstate-handler/0.log" Feb 23 15:27:15.980252 master-0 kubenswrapper[28758]: I0223 15:27:15.979389 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-6g7nb_1233ccec-26e2-48e3-b43a-917eda81883d/nmstate-metrics/0.log" Feb 23 15:27:16.011097 master-0 kubenswrapper[28758]: I0223 15:27:16.011042 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-6g7nb_1233ccec-26e2-48e3-b43a-917eda81883d/kube-rbac-proxy/0.log" Feb 23 15:27:16.036750 master-0 kubenswrapper[28758]: I0223 15:27:16.036331 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-2w9dm_c12ddca1-f73b-49d7-8906-5c8a32346b00/nmstate-operator/0.log" Feb 23 15:27:16.057842 master-0 kubenswrapper[28758]: I0223 15:27:16.057604 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-nmnbc_567c7ff6-21b3-4463-9a94-43b90d5fc1de/nmstate-webhook/0.log" Feb 23 15:27:16.479053 master-0 kubenswrapper[28758]: I0223 15:27:16.476823 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-vdjqg_669f05a2-9b50-4d6e-8d16-0a5050939b84/controller/0.log" Feb 23 15:27:16.493506 master-0 kubenswrapper[28758]: I0223 15:27:16.491000 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-vdjqg_669f05a2-9b50-4d6e-8d16-0a5050939b84/kube-rbac-proxy/0.log" Feb 23 15:27:16.517632 master-0 kubenswrapper[28758]: I0223 15:27:16.516563 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/controller/0.log" Feb 23 15:27:17.995551 master-0 kubenswrapper[28758]: I0223 15:27:17.994486 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/frr/0.log" Feb 23 15:27:18.005544 master-0 kubenswrapper[28758]: I0223 15:27:18.005251 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/reloader/0.log" Feb 23 15:27:18.014543 master-0 kubenswrapper[28758]: I0223 15:27:18.013939 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/frr-metrics/0.log" Feb 23 15:27:18.028545 master-0 kubenswrapper[28758]: I0223 15:27:18.028173 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/kube-rbac-proxy/0.log" Feb 23 15:27:18.041313 master-0 kubenswrapper[28758]: I0223 15:27:18.037218 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/kube-rbac-proxy-frr/0.log" Feb 23 15:27:18.048494 master-0 kubenswrapper[28758]: I0223 15:27:18.045652 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-frr-files/0.log" Feb 23 15:27:18.057523 master-0 kubenswrapper[28758]: I0223 15:27:18.056594 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-reloader/0.log" Feb 23 15:27:18.069546 master-0 kubenswrapper[28758]: I0223 15:27:18.066417 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-metrics/0.log" Feb 23 15:27:18.076550 master-0 kubenswrapper[28758]: I0223 15:27:18.076486 28758 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-6gz6s_df5d30d1-17cc-4d03-9f0f-d5f34c0b5715/frr-k8s-webhook-server/0.log" Feb 23 15:27:18.165004 master-0 kubenswrapper[28758]: I0223 15:27:18.164959 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d66bdc8f4-chnqv_0bb85bd1-300c-4786-96ff-56978d399495/manager/0.log" Feb 23 15:27:18.183717 master-0 kubenswrapper[28758]: I0223 15:27:18.183654 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57446bd8dd-6nwv7_7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1/webhook-server/0.log" Feb 23 15:27:18.244883 master-0 kubenswrapper[28758]: I0223 15:27:18.244836 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcdctl/0.log" Feb 23 15:27:18.851285 master-0 kubenswrapper[28758]: I0223 15:27:18.849735 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cgfmv_1b984ba9-84db-4b51-ac8d-f92f22a4b76f/speaker/0.log" Feb 23 15:27:18.862691 master-0 kubenswrapper[28758]: I0223 15:27:18.862318 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cgfmv_1b984ba9-84db-4b51-ac8d-f92f22a4b76f/kube-rbac-proxy/0.log" Feb 23 15:27:19.082343 master-0 kubenswrapper[28758]: I0223 15:27:19.082257 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd/0.log" Feb 23 15:27:19.102600 master-0 kubenswrapper[28758]: I0223 15:27:19.101781 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-metrics/0.log" Feb 23 15:27:19.119272 master-0 kubenswrapper[28758]: I0223 15:27:19.119192 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-readyz/0.log" Feb 23 15:27:19.142089 master-0 kubenswrapper[28758]: I0223 15:27:19.141989 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-rev/0.log" Feb 23 15:27:19.166378 master-0 kubenswrapper[28758]: I0223 15:27:19.166300 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/setup/0.log" Feb 23 15:27:19.222132 master-0 kubenswrapper[28758]: I0223 15:27:19.222069 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-ensure-env-vars/0.log" Feb 23 15:27:19.244820 master-0 kubenswrapper[28758]: I0223 15:27:19.244705 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-resources-copy/0.log" Feb 23 15:27:19.303579 master-0 kubenswrapper[28758]: I0223 15:27:19.303423 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_5f67ab24-82bc-4e71-b974-e25b819986c8/installer/0.log" Feb 23 15:27:19.808623 master-0 kubenswrapper[28758]: I0223 15:27:19.808558 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-86776854f6-g4ksn_ca8ab367-b3d7-4663-b4af-284e424dced7/oauth-openshift/0.log" Feb 23 15:27:20.145376 master-0 kubenswrapper[28758]: I0223 15:27:20.145323 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/must-gather-jmktf" event={"ID":"09d8a16c-fdc6-4086-978e-45952af9c3af","Type":"ContainerStarted","Data":"2d5847f2957ef9d2c8e8bd881a83cc99e69247ff312e47f63656717b0c8d6052"} Feb 23 15:27:20.565599 master-0 kubenswrapper[28758]: I0223 15:27:20.565453 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/assisted-installer_assisted-installer-controller-r6z45_0514f486-2562-473d-8b01-b69441b82367/assisted-installer-controller/0.log" Feb 23 15:27:21.020644 master-0 kubenswrapper[28758]: I0223 15:27:21.020585 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5bd7c86784-mlbx2_e2d00ece-7586-4346-adbb-eaae1aeda69e/authentication-operator/2.log" Feb 23 15:27:21.081878 master-0 kubenswrapper[28758]: I0223 15:27:21.081813 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-5bd7c86784-mlbx2_e2d00ece-7586-4346-adbb-eaae1aeda69e/authentication-operator/3.log" Feb 23 15:27:21.163084 master-0 kubenswrapper[28758]: I0223 15:27:21.162920 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/must-gather-jmktf" event={"ID":"09d8a16c-fdc6-4086-978e-45952af9c3af","Type":"ContainerStarted","Data":"c24a2eadbce0dc755076f7614f8146fbefd8a86433a744d68071b107d411eccc"} Feb 23 15:27:21.181608 master-0 kubenswrapper[28758]: I0223 15:27:21.181433 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98rpd/must-gather-jmktf" podStartSLOduration=2.884689485 podStartE2EDuration="14.18139958s" podCreationTimestamp="2026-02-23 15:27:07 +0000 UTC" firstStartedPulling="2026-02-23 15:27:08.400253467 +0000 UTC m=+3160.526569399" lastFinishedPulling="2026-02-23 15:27:19.696963562 +0000 UTC m=+3171.823279494" observedRunningTime="2026-02-23 15:27:21.179338725 +0000 UTC m=+3173.305654657" watchObservedRunningTime="2026-02-23 15:27:21.18139958 +0000 UTC m=+3173.307715542" Feb 23 15:27:21.959292 master-0 kubenswrapper[28758]: I0223 15:27:21.959227 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s"] Feb 23 15:27:21.969020 master-0 kubenswrapper[28758]: I0223 15:27:21.967783 28758 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:21.973047 master-0 kubenswrapper[28758]: I0223 15:27:21.972986 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s"] Feb 23 15:27:22.004816 master-0 kubenswrapper[28758]: I0223 15:27:22.004758 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b65dc9fcb-w68qb_06bde94a-3126-4d0f-baba-49dc5fbec61b/router/1.log" Feb 23 15:27:22.018483 master-0 kubenswrapper[28758]: I0223 15:27:22.018350 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b65dc9fcb-w68qb_06bde94a-3126-4d0f-baba-49dc5fbec61b/router/0.log" Feb 23 15:27:22.122864 master-0 kubenswrapper[28758]: I0223 15:27:22.121152 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-lib-modules\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.122864 master-0 kubenswrapper[28758]: I0223 15:27:22.121320 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-sys\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.122864 master-0 kubenswrapper[28758]: I0223 15:27:22.121365 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-podres\") pod \"perf-node-gather-daemonset-gs42s\" (UID: 
\"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.122864 master-0 kubenswrapper[28758]: I0223 15:27:22.121390 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-proc\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.122864 master-0 kubenswrapper[28758]: I0223 15:27:22.121437 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khx6c\" (UniqueName: \"kubernetes.io/projected/db7aab79-c8dc-4699-9431-38894cf3804f-kube-api-access-khx6c\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.225105 master-0 kubenswrapper[28758]: I0223 15:27:22.224973 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-sys\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.225105 master-0 kubenswrapper[28758]: I0223 15:27:22.225090 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-podres\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.225743 master-0 kubenswrapper[28758]: I0223 15:27:22.225139 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" 
(UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-proc\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.225743 master-0 kubenswrapper[28758]: I0223 15:27:22.225199 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khx6c\" (UniqueName: \"kubernetes.io/projected/db7aab79-c8dc-4699-9431-38894cf3804f-kube-api-access-khx6c\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.225743 master-0 kubenswrapper[28758]: I0223 15:27:22.225320 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-lib-modules\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.225743 master-0 kubenswrapper[28758]: I0223 15:27:22.225613 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-lib-modules\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.225743 master-0 kubenswrapper[28758]: I0223 15:27:22.225670 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-sys\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.228569 master-0 kubenswrapper[28758]: I0223 15:27:22.226023 
28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-podres\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.228569 master-0 kubenswrapper[28758]: I0223 15:27:22.226074 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/db7aab79-c8dc-4699-9431-38894cf3804f-proc\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.244610 master-0 kubenswrapper[28758]: I0223 15:27:22.244535 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khx6c\" (UniqueName: \"kubernetes.io/projected/db7aab79-c8dc-4699-9431-38894cf3804f-kube-api-access-khx6c\") pod \"perf-node-gather-daemonset-gs42s\" (UID: \"db7aab79-c8dc-4699-9431-38894cf3804f\") " pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.304041 master-0 kubenswrapper[28758]: I0223 15:27:22.303553 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:22.791366 master-0 kubenswrapper[28758]: I0223 15:27:22.789281 28758 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s"] Feb 23 15:27:22.899901 master-0 kubenswrapper[28758]: I0223 15:27:22.899846 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67f44b4d6d-7lpn4_ea0b3538-9a7d-4995-b628-2d63f21d683c/oauth-apiserver/0.log" Feb 23 15:27:22.921619 master-0 kubenswrapper[28758]: I0223 15:27:22.921566 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67f44b4d6d-7lpn4_ea0b3538-9a7d-4995-b628-2d63f21d683c/fix-audit-permissions/0.log" Feb 23 15:27:23.192514 master-0 kubenswrapper[28758]: I0223 15:27:23.186747 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" event={"ID":"db7aab79-c8dc-4699-9431-38894cf3804f","Type":"ContainerStarted","Data":"088092456ff4b5e56074dc4006633862658d4496c91b491ae17802f4114a7b3c"} Feb 23 15:27:23.911079 master-0 kubenswrapper[28758]: I0223 15:27:23.910949 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/kube-rbac-proxy/0.log" Feb 23 15:27:23.927246 master-0 kubenswrapper[28758]: I0223 15:27:23.927192 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/cluster-autoscaler-operator/0.log" Feb 23 15:27:23.945753 master-0 kubenswrapper[28758]: I0223 15:27:23.945704 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-86b8dc6d6-2kvfp_3d3f4da5-d82d-40b7-9aaf-53ae6eb4aca3/cluster-autoscaler-operator/1.log" Feb 23 
15:27:23.969591 master-0 kubenswrapper[28758]: I0223 15:27:23.969479 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/2.log" Feb 23 15:27:23.969879 master-0 kubenswrapper[28758]: I0223 15:27:23.969850 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/cluster-baremetal-operator/3.log" Feb 23 15:27:23.985174 master-0 kubenswrapper[28758]: I0223 15:27:23.985105 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-d6bb9bb76-4frj6_12b256b7-a57b-4124-8452-25e74cfa7926/baremetal-kube-rbac-proxy/0.log" Feb 23 15:27:24.009082 master-0 kubenswrapper[28758]: I0223 15:27:24.009022 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-686847ff5f-9q266_4373687a-61a0-434b-81f7-3fecaa1494ef/control-plane-machine-set-operator/1.log" Feb 23 15:27:24.009328 master-0 kubenswrapper[28758]: I0223 15:27:24.009048 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-686847ff5f-9q266_4373687a-61a0-434b-81f7-3fecaa1494ef/control-plane-machine-set-operator/0.log" Feb 23 15:27:24.032917 master-0 kubenswrapper[28758]: I0223 15:27:24.032871 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5c7cf458b4-bb7zl_ceba7b56-f910-473d-aed5-add94868fb31/kube-rbac-proxy/0.log" Feb 23 15:27:24.068965 master-0 kubenswrapper[28758]: I0223 15:27:24.068901 28758 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98rpd/master-0-debug-vbp2t"] Feb 23 15:27:24.071292 master-0 kubenswrapper[28758]: I0223 15:27:24.071250 28758 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 15:27:24.075092 master-0 kubenswrapper[28758]: I0223 15:27:24.075065 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5c7cf458b4-bb7zl_ceba7b56-f910-473d-aed5-add94868fb31/machine-api-operator/0.log" Feb 23 15:27:24.184349 master-0 kubenswrapper[28758]: I0223 15:27:24.184275 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d12937f-d0b0-477b-b3ff-3503bd7655fa-host\") pod \"master-0-debug-vbp2t\" (UID: \"7d12937f-d0b0-477b-b3ff-3503bd7655fa\") " pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 15:27:24.184608 master-0 kubenswrapper[28758]: I0223 15:27:24.184574 28758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgbrv\" (UniqueName: \"kubernetes.io/projected/7d12937f-d0b0-477b-b3ff-3503bd7655fa-kube-api-access-mgbrv\") pod \"master-0-debug-vbp2t\" (UID: \"7d12937f-d0b0-477b-b3ff-3503bd7655fa\") " pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 15:27:24.208944 master-0 kubenswrapper[28758]: I0223 15:27:24.208886 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" event={"ID":"db7aab79-c8dc-4699-9431-38894cf3804f","Type":"ContainerStarted","Data":"c1a0bd8fadee1843c722f4367c6dafe9669e9a4efdda1ff404d6ddd69f7b9928"} Feb 23 15:27:24.209340 master-0 kubenswrapper[28758]: I0223 15:27:24.209315 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:24.233010 master-0 kubenswrapper[28758]: I0223 15:27:24.232904 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" 
podStartSLOduration=3.232860004 podStartE2EDuration="3.232860004s" podCreationTimestamp="2026-02-23 15:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 15:27:24.225318571 +0000 UTC m=+3176.351634503" watchObservedRunningTime="2026-02-23 15:27:24.232860004 +0000 UTC m=+3176.359175986" Feb 23 15:27:24.287300 master-0 kubenswrapper[28758]: I0223 15:27:24.287250 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgbrv\" (UniqueName: \"kubernetes.io/projected/7d12937f-d0b0-477b-b3ff-3503bd7655fa-kube-api-access-mgbrv\") pod \"master-0-debug-vbp2t\" (UID: \"7d12937f-d0b0-477b-b3ff-3503bd7655fa\") " pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 15:27:24.287662 master-0 kubenswrapper[28758]: I0223 15:27:24.287638 28758 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d12937f-d0b0-477b-b3ff-3503bd7655fa-host\") pod \"master-0-debug-vbp2t\" (UID: \"7d12937f-d0b0-477b-b3ff-3503bd7655fa\") " pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 15:27:24.288432 master-0 kubenswrapper[28758]: I0223 15:27:24.288069 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7d12937f-d0b0-477b-b3ff-3503bd7655fa-host\") pod \"master-0-debug-vbp2t\" (UID: \"7d12937f-d0b0-477b-b3ff-3503bd7655fa\") " pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 15:27:24.309990 master-0 kubenswrapper[28758]: I0223 15:27:24.308780 28758 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgbrv\" (UniqueName: \"kubernetes.io/projected/7d12937f-d0b0-477b-b3ff-3503bd7655fa-kube-api-access-mgbrv\") pod \"master-0-debug-vbp2t\" (UID: \"7d12937f-d0b0-477b-b3ff-3503bd7655fa\") " pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 
15:27:24.417292 master-0 kubenswrapper[28758]: I0223 15:27:24.417214 28758 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" Feb 23 15:27:24.456881 master-0 kubenswrapper[28758]: W0223 15:27:24.456769 28758 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d12937f_d0b0_477b_b3ff_3503bd7655fa.slice/crio-20cea955ac87d3e5c350bb5a0840e5c48a278e83059afa4607bc10f496f597b3 WatchSource:0}: Error finding container 20cea955ac87d3e5c350bb5a0840e5c48a278e83059afa4607bc10f496f597b3: Status 404 returned error can't find the container with id 20cea955ac87d3e5c350bb5a0840e5c48a278e83059afa4607bc10f496f597b3 Feb 23 15:27:25.241998 master-0 kubenswrapper[28758]: I0223 15:27:25.241917 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" event={"ID":"7d12937f-d0b0-477b-b3ff-3503bd7655fa","Type":"ContainerStarted","Data":"20cea955ac87d3e5c350bb5a0840e5c48a278e83059afa4607bc10f496f597b3"} Feb 23 15:27:25.581767 master-0 kubenswrapper[28758]: I0223 15:27:25.581572 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb_172d47fd-e1a1-4d77-9e31-c4f22e824d5f/cluster-cloud-controller-manager/0.log" Feb 23 15:27:25.604287 master-0 kubenswrapper[28758]: I0223 15:27:25.604217 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb_172d47fd-e1a1-4d77-9e31-c4f22e824d5f/config-sync-controllers/0.log" Feb 23 15:27:25.629847 master-0 kubenswrapper[28758]: I0223 15:27:25.628892 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-67dd8d7969-b2bkb_172d47fd-e1a1-4d77-9e31-c4f22e824d5f/kube-rbac-proxy/0.log" Feb 23 15:27:25.837632 master-0 kubenswrapper[28758]: I0223 15:27:25.836954 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2d990-api-0_1c260b91-35a4-4ed2-800e-14c67846ca98/cinder-2d990-api-log/0.log" Feb 23 15:27:25.902604 master-0 kubenswrapper[28758]: I0223 15:27:25.902503 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2d990-api-0_1c260b91-35a4-4ed2-800e-14c67846ca98/cinder-api/0.log" Feb 23 15:27:26.049856 master-0 kubenswrapper[28758]: I0223 15:27:26.049014 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2d990-backup-0_3465aa00-c58d-4c78-8d97-4f2543f9265d/cinder-backup/0.log" Feb 23 15:27:26.092951 master-0 kubenswrapper[28758]: I0223 15:27:26.090386 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2d990-backup-0_3465aa00-c58d-4c78-8d97-4f2543f9265d/probe/0.log" Feb 23 15:27:26.181006 master-0 kubenswrapper[28758]: I0223 15:27:26.180826 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2d990-scheduler-0_b7266a40-897f-41e1-a8bc-0bd0c7c0f268/cinder-scheduler/0.log" Feb 23 15:27:26.221335 master-0 kubenswrapper[28758]: I0223 15:27:26.221239 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2d990-scheduler-0_b7266a40-897f-41e1-a8bc-0bd0c7c0f268/probe/0.log" Feb 23 15:27:26.349290 master-0 kubenswrapper[28758]: I0223 15:27:26.344011 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2d990-volume-lvm-iscsi-0_dc9b5902-d242-43ef-a8cc-a6b9f256507a/cinder-volume/0.log" Feb 23 15:27:26.380568 master-0 kubenswrapper[28758]: I0223 15:27:26.380498 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-2d990-volume-lvm-iscsi-0_dc9b5902-d242-43ef-a8cc-a6b9f256507a/probe/0.log" Feb 23 15:27:26.396262 master-0 kubenswrapper[28758]: I0223 15:27:26.396211 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6746dffb99-qhhmg_0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5/dnsmasq-dns/0.log" Feb 23 15:27:26.418520 master-0 kubenswrapper[28758]: I0223 15:27:26.415033 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6746dffb99-qhhmg_0a3fc6a0-e761-41d8-90bf-d2a8ed7615a5/init/0.log" Feb 23 15:27:26.508109 master-0 kubenswrapper[28758]: I0223 15:27:26.508061 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-63e78-default-external-api-0_509b3d39-1b4f-440e-831b-997244388255/glance-log/0.log" Feb 23 15:27:26.523593 master-0 kubenswrapper[28758]: I0223 15:27:26.523536 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-63e78-default-external-api-0_509b3d39-1b4f-440e-831b-997244388255/glance-httpd/0.log" Feb 23 15:27:26.625674 master-0 kubenswrapper[28758]: I0223 15:27:26.625519 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-63e78-default-internal-api-0_3be6a9b5-62a2-49f0-8871-aed1d7a7f588/glance-log/0.log" Feb 23 15:27:26.649070 master-0 kubenswrapper[28758]: I0223 15:27:26.649027 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-63e78-default-internal-api-0_3be6a9b5-62a2-49f0-8871-aed1d7a7f588/glance-httpd/0.log" Feb 23 15:27:26.703605 master-0 kubenswrapper[28758]: I0223 15:27:26.702796 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-55d877d745-zlz8n_ec9d99fd-acd0-4435-bc55-034519a4b417/ironic-api-log/0.log" Feb 23 15:27:26.784945 master-0 kubenswrapper[28758]: I0223 15:27:26.784839 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-55d877d745-zlz8n_ec9d99fd-acd0-4435-bc55-034519a4b417/ironic-api/0.log" Feb 23 15:27:26.795185 master-0 kubenswrapper[28758]: I0223 15:27:26.795144 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-55d877d745-zlz8n_ec9d99fd-acd0-4435-bc55-034519a4b417/init/0.log" Feb 23 15:27:26.836316 master-0 kubenswrapper[28758]: I0223 15:27:26.836230 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_894fbd22-c889-426b-954b-04a9a0e4d905/ironic-conductor/0.log" Feb 23 15:27:26.844676 master-0 kubenswrapper[28758]: I0223 15:27:26.844375 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_894fbd22-c889-426b-954b-04a9a0e4d905/httpboot/0.log" Feb 23 15:27:26.851230 master-0 kubenswrapper[28758]: I0223 15:27:26.851159 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_894fbd22-c889-426b-954b-04a9a0e4d905/dnsmasq/0.log" Feb 23 15:27:26.859010 master-0 kubenswrapper[28758]: I0223 15:27:26.858948 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_894fbd22-c889-426b-954b-04a9a0e4d905/init/0.log" Feb 23 15:27:26.871274 master-0 kubenswrapper[28758]: I0223 15:27:26.870795 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_894fbd22-c889-426b-954b-04a9a0e4d905/ironic-python-agent-init/0.log" Feb 23 15:27:27.786296 master-0 kubenswrapper[28758]: I0223 15:27:27.786233 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_894fbd22-c889-426b-954b-04a9a0e4d905/pxe-init/0.log" Feb 23 15:27:27.872326 master-0 kubenswrapper[28758]: I0223 15:27:27.872188 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_5819b1bd-dd89-45b4-84ed-83a1355314de/ironic-inspector-httpd/0.log" Feb 23 15:27:27.920254 master-0 kubenswrapper[28758]: I0223 
15:27:27.916778 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-6968c58f46-p7jh7_85365dec-af50-406c-b258-890e4f454c4a/kube-rbac-proxy/0.log" Feb 23 15:27:27.981836 master-0 kubenswrapper[28758]: I0223 15:27:27.979588 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-6968c58f46-p7jh7_85365dec-af50-406c-b258-890e4f454c4a/cloud-credential-operator/0.log" Feb 23 15:27:27.985677 master-0 kubenswrapper[28758]: I0223 15:27:27.985029 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_5819b1bd-dd89-45b4-84ed-83a1355314de/ironic-inspector/0.log" Feb 23 15:27:27.990247 master-0 kubenswrapper[28758]: I0223 15:27:27.988952 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-6968c58f46-p7jh7_85365dec-af50-406c-b258-890e4f454c4a/cloud-credential-operator/1.log" Feb 23 15:27:27.995867 master-0 kubenswrapper[28758]: I0223 15:27:27.995306 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_5819b1bd-dd89-45b4-84ed-83a1355314de/inspector-httpboot/0.log" Feb 23 15:27:28.003031 master-0 kubenswrapper[28758]: I0223 15:27:28.002947 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_5819b1bd-dd89-45b4-84ed-83a1355314de/ramdisk-logs/0.log" Feb 23 15:27:28.018516 master-0 kubenswrapper[28758]: I0223 15:27:28.015599 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_5819b1bd-dd89-45b4-84ed-83a1355314de/inspector-dnsmasq/0.log" Feb 23 15:27:28.033661 master-0 kubenswrapper[28758]: I0223 15:27:28.033302 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_5819b1bd-dd89-45b4-84ed-83a1355314de/ironic-python-agent-init/0.log" Feb 23 15:27:28.049400 master-0 
kubenswrapper[28758]: I0223 15:27:28.049302 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_5819b1bd-dd89-45b4-84ed-83a1355314de/inspector-pxe-init/0.log" Feb 23 15:27:28.059103 master-0 kubenswrapper[28758]: I0223 15:27:28.059076 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-769d7f49c5-jj8xx_8c0c40ba-2093-4e1a-8166-bbb4c53f3a08/ironic-neutron-agent/2.log" Feb 23 15:27:28.064152 master-0 kubenswrapper[28758]: I0223 15:27:28.063234 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-769d7f49c5-jj8xx_8c0c40ba-2093-4e1a-8166-bbb4c53f3a08/ironic-neutron-agent/1.log" Feb 23 15:27:28.166336 master-0 kubenswrapper[28758]: I0223 15:27:28.166275 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56fc45f8f5-fsvgg_33a3834a-cb45-4a38-ab13-67b38425de47/keystone-api/0.log" Feb 23 15:27:28.176685 master-0 kubenswrapper[28758]: I0223 15:27:28.176636 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29530981-kbw2b_6ddbd438-9416-4af2-95cd-ac31b5a419b3/keystone-cron/0.log" Feb 23 15:27:30.181006 master-0 kubenswrapper[28758]: I0223 15:27:30.180947 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/4.log" Feb 23 15:27:30.185626 master-0 kubenswrapper[28758]: I0223 15:27:30.182693 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-config-operator/5.log" Feb 23 15:27:30.194509 master-0 kubenswrapper[28758]: I0223 15:27:30.194434 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-config-operator_openshift-config-operator-6f47d587d6-55qjr_92c63c95-e880-4f51-9858-7715343f7bd8/openshift-api/0.log" Feb 23 15:27:32.333190 master-0 kubenswrapper[28758]: I0223 15:27:32.333126 28758 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-98rpd/perf-node-gather-daemonset-gs42s" Feb 23 15:27:32.654298 master-0 kubenswrapper[28758]: I0223 15:27:32.654138 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-5df5ffc47c-mlk22_aa3fb4e1-1d55-46c2-af94-6063cdedd456/console-operator/0.log" Feb 23 15:27:33.691397 master-0 kubenswrapper[28758]: I0223 15:27:33.691201 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86d94cc75-jkcsk_a05e5a62-a07f-4f60-8c77-68b7acaa07f0/console/0.log" Feb 23 15:27:33.766331 master-0 kubenswrapper[28758]: I0223 15:27:33.766238 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-955b69498-krrp8_7e3a3f31-0bce-4392-affe-446a58284289/download-server/0.log" Feb 23 15:27:34.978530 master-0 kubenswrapper[28758]: I0223 15:27:34.976528 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f94476f49-s6c8v_fbb66172-1ea9-4683-b88f-227c4fd94924/cluster-storage-operator/0.log" Feb 23 15:27:34.978530 master-0 kubenswrapper[28758]: I0223 15:27:34.978308 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f94476f49-s6c8v_fbb66172-1ea9-4683-b88f-227c4fd94924/cluster-storage-operator/1.log" Feb 23 15:27:35.004304 master-0 kubenswrapper[28758]: I0223 15:27:35.004242 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/3.log" Feb 23 15:27:35.004994 master-0 
kubenswrapper[28758]: I0223 15:27:35.004944 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-6847bb4785-5fw2x_2e89a047-9ebc-459b-b7b3-e902c1fb0e17/snapshot-controller/4.log" Feb 23 15:27:35.062512 master-0 kubenswrapper[28758]: I0223 15:27:35.061601 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-6fb4df594f-hkcgz_a4ae9292-71dc-4484-b277-43cb26c1e04d/csi-snapshot-controller-operator/1.log" Feb 23 15:27:35.065633 master-0 kubenswrapper[28758]: I0223 15:27:35.063302 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-6fb4df594f-hkcgz_a4ae9292-71dc-4484-b277-43cb26c1e04d/csi-snapshot-controller-operator/0.log" Feb 23 15:27:36.192617 master-0 kubenswrapper[28758]: I0223 15:27:36.192498 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-8c7d49845-5rk2g_607c1101-3533-43e3-9eda-13cea2b9dbb6/dns-operator/0.log" Feb 23 15:27:36.206710 master-0 kubenswrapper[28758]: I0223 15:27:36.206653 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-8c7d49845-5rk2g_607c1101-3533-43e3-9eda-13cea2b9dbb6/kube-rbac-proxy/0.log" Feb 23 15:27:37.420932 master-0 kubenswrapper[28758]: I0223 15:27:37.420841 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-86l7f_6a801da1-a7eb-4187-98b8-315076f55e19/dns/0.log" Feb 23 15:27:37.437540 master-0 kubenswrapper[28758]: I0223 15:27:37.437185 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-86l7f_6a801da1-a7eb-4187-98b8-315076f55e19/kube-rbac-proxy/0.log" Feb 23 15:27:37.461857 master-0 kubenswrapper[28758]: I0223 15:27:37.461800 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-7b6jk_2f876e5d-2e82-47d0-8a9c-adacf2bddf77/dns-node-resolver/0.log" Feb 23 15:27:38.512626 master-0 kubenswrapper[28758]: I0223 15:27:38.512577 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/2.log" Feb 23 15:27:38.569691 master-0 kubenswrapper[28758]: I0223 15:27:38.569597 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-545bf96f4d-fpwtm_8de1f285-47ac-42aa-8026-8addce656362/etcd-operator/3.log" Feb 23 15:27:39.446064 master-0 kubenswrapper[28758]: I0223 15:27:39.446002 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcdctl/0.log" Feb 23 15:27:40.265395 master-0 kubenswrapper[28758]: I0223 15:27:40.265247 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd/0.log" Feb 23 15:27:40.289713 master-0 kubenswrapper[28758]: I0223 15:27:40.289669 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-metrics/0.log" Feb 23 15:27:40.306202 master-0 kubenswrapper[28758]: I0223 15:27:40.305971 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-readyz/0.log" Feb 23 15:27:40.321151 master-0 kubenswrapper[28758]: I0223 15:27:40.321098 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-rev/0.log" Feb 23 15:27:40.340874 master-0 kubenswrapper[28758]: I0223 15:27:40.340814 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/setup/0.log" Feb 23 15:27:40.359565 master-0 kubenswrapper[28758]: I0223 
15:27:40.359470 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-ensure-env-vars/0.log" Feb 23 15:27:40.388027 master-0 kubenswrapper[28758]: I0223 15:27:40.387624 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_18a83278819db2092fa26d8274eb3f00/etcd-resources-copy/0.log" Feb 23 15:27:40.473426 master-0 kubenswrapper[28758]: I0223 15:27:40.473363 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_5f67ab24-82bc-4e71-b974-e25b819986c8/installer/0.log" Feb 23 15:27:41.543216 master-0 kubenswrapper[28758]: I0223 15:27:41.543130 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-779979bdf7-ml2d7_ad0f0d72-0337-4347-bb50-e299a175f3ca/cluster-image-registry-operator/0.log" Feb 23 15:27:41.595678 master-0 kubenswrapper[28758]: I0223 15:27:41.595624 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-779979bdf7-ml2d7_ad0f0d72-0337-4347-bb50-e299a175f3ca/cluster-image-registry-operator/1.log" Feb 23 15:27:41.616543 master-0 kubenswrapper[28758]: I0223 15:27:41.616454 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sxgv7_0325482a-66bb-476c-93ac-01b718836a37/node-ca/0.log" Feb 23 15:27:42.592885 master-0 kubenswrapper[28758]: I0223 15:27:42.590129 28758 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" event={"ID":"7d12937f-d0b0-477b-b3ff-3503bd7655fa","Type":"ContainerStarted","Data":"55898f9b9c566155c0fe197d60e362c3ae0af071a394e92372517de0fefda503"} Feb 23 15:27:42.597968 master-0 kubenswrapper[28758]: I0223 15:27:42.597751 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_29c2ac3e-73c3-4a07-a129-fcea6817fda3/memcached/0.log" Feb 23 15:27:42.628511 master-0 kubenswrapper[28758]: I0223 15:27:42.628170 28758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98rpd/master-0-debug-vbp2t" podStartSLOduration=1.293984135 podStartE2EDuration="18.628150483s" podCreationTimestamp="2026-02-23 15:27:24 +0000 UTC" firstStartedPulling="2026-02-23 15:27:24.459669752 +0000 UTC m=+3176.585985684" lastFinishedPulling="2026-02-23 15:27:41.7938361 +0000 UTC m=+3193.920152032" observedRunningTime="2026-02-23 15:27:42.61433485 +0000 UTC m=+3194.740650772" watchObservedRunningTime="2026-02-23 15:27:42.628150483 +0000 UTC m=+3194.754466415" Feb 23 15:27:42.686177 master-0 kubenswrapper[28758]: I0223 15:27:42.686109 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/4.log" Feb 23 15:27:42.697995 master-0 kubenswrapper[28758]: I0223 15:27:42.697675 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/ingress-operator/5.log" Feb 23 15:27:42.712255 master-0 kubenswrapper[28758]: I0223 15:27:42.711166 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-6569778c84-hsl6c_3488a7eb-5170-478c-9af7-490dbe0f514e/kube-rbac-proxy/0.log" Feb 23 15:27:42.787609 master-0 kubenswrapper[28758]: I0223 15:27:42.787440 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b8b575dff-c9z8b_575c77e7-ca40-4cb6-9ef4-9938f44932ec/neutron-api/0.log" Feb 23 15:27:42.804501 master-0 kubenswrapper[28758]: I0223 15:27:42.803644 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b8b575dff-c9z8b_575c77e7-ca40-4cb6-9ef4-9938f44932ec/neutron-httpd/0.log" 
Feb 23 15:27:42.912809 master-0 kubenswrapper[28758]: I0223 15:27:42.912687 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3d31ce50-c005-43ee-b185-fc4991705cf2/nova-api-log/0.log" Feb 23 15:27:43.150162 master-0 kubenswrapper[28758]: I0223 15:27:43.148849 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3d31ce50-c005-43ee-b185-fc4991705cf2/nova-api-api/0.log" Feb 23 15:27:43.263412 master-0 kubenswrapper[28758]: I0223 15:27:43.261779 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_0aeba98f-5080-49b7-bcc7-2afffb29d0a2/nova-cell0-conductor-conductor/0.log" Feb 23 15:27:43.382320 master-0 kubenswrapper[28758]: I0223 15:27:43.382255 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-compute-ironic-compute-0_75831807-40a3-4be4-ab18-b122079ee4ff/nova-cell1-compute-ironic-compute-compute/0.log" Feb 23 15:27:43.471430 master-0 kubenswrapper[28758]: I0223 15:27:43.471371 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6664573d-f4f2-4a39-9c7b-03c62a50a511/nova-cell1-conductor-conductor/0.log" Feb 23 15:27:43.536917 master-0 kubenswrapper[28758]: I0223 15:27:43.536787 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1b8a1637-d3b3-4945-98f6-128eed86cb10/nova-cell1-novncproxy-novncproxy/0.log" Feb 23 15:27:43.602596 master-0 kubenswrapper[28758]: I0223 15:27:43.602542 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9f3c354f-1b43-43ba-9c77-f19db5f74a2f/nova-metadata-log/0.log" Feb 23 15:27:43.818889 master-0 kubenswrapper[28758]: I0223 15:27:43.818705 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nwdpd_87f989cd-6c19-4a30-833a-10e98b7a0326/serve-healthcheck-canary/0.log" Feb 23 15:27:44.524351 master-0 kubenswrapper[28758]: 
I0223 15:27:44.524278 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9f3c354f-1b43-43ba-9c77-f19db5f74a2f/nova-metadata-metadata/0.log" Feb 23 15:27:44.621432 master-0 kubenswrapper[28758]: I0223 15:27:44.621319 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6f5c91f3-d3a3-445d-bd0b-e81ed6db60b6/nova-scheduler-scheduler/0.log" Feb 23 15:27:44.650489 master-0 kubenswrapper[28758]: I0223 15:27:44.650297 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7a29d97-0b1c-4657-ae05-8ef48a3813ba/galera/0.log" Feb 23 15:27:44.668016 master-0 kubenswrapper[28758]: I0223 15:27:44.667963 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f7a29d97-0b1c-4657-ae05-8ef48a3813ba/mysql-bootstrap/0.log" Feb 23 15:27:44.672491 master-0 kubenswrapper[28758]: I0223 15:27:44.672405 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-59b498fcfb-rz897_ae4baa4e-4ef4-433d-aa36-149e92fa6ee2/insights-operator/0.log" Feb 23 15:27:44.696027 master-0 kubenswrapper[28758]: I0223 15:27:44.695965 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_daacc97c-efdc-40e3-b833-237dde2caafe/galera/0.log" Feb 23 15:27:44.712565 master-0 kubenswrapper[28758]: I0223 15:27:44.712516 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_daacc97c-efdc-40e3-b833-237dde2caafe/mysql-bootstrap/0.log" Feb 23 15:27:44.727209 master-0 kubenswrapper[28758]: I0223 15:27:44.727169 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_718a5e05-c1d8-4982-8808-135e9883dab9/openstackclient/0.log" Feb 23 15:27:44.744689 master-0 kubenswrapper[28758]: I0223 15:27:44.744618 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-lssgn_b9342b63-0ec2-4c10-898a-cebd5e86414a/ovn-controller/0.log" Feb 23 15:27:44.752543 master-0 kubenswrapper[28758]: I0223 15:27:44.752455 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pz8wl_2ab015df-a84b-45bb-8955-b2210c1564f0/openstack-network-exporter/0.log" Feb 23 15:27:44.765333 master-0 kubenswrapper[28758]: I0223 15:27:44.765287 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-td6ds_f847003c-7775-4189-896d-b6c727a97222/ovsdb-server/0.log" Feb 23 15:27:44.777561 master-0 kubenswrapper[28758]: I0223 15:27:44.777367 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-td6ds_f847003c-7775-4189-896d-b6c727a97222/ovs-vswitchd/0.log" Feb 23 15:27:44.786506 master-0 kubenswrapper[28758]: I0223 15:27:44.786453 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-td6ds_f847003c-7775-4189-896d-b6c727a97222/ovsdb-server-init/0.log" Feb 23 15:27:44.800066 master-0 kubenswrapper[28758]: I0223 15:27:44.800015 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_83080fd9-e913-441c-b248-8fd33251ced3/ovn-northd/0.log" Feb 23 15:27:44.807777 master-0 kubenswrapper[28758]: I0223 15:27:44.807692 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_83080fd9-e913-441c-b248-8fd33251ced3/openstack-network-exporter/0.log" Feb 23 15:27:44.833071 master-0 kubenswrapper[28758]: I0223 15:27:44.832999 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_714c1426-c191-4c69-8346-acf55252ed0a/ovsdbserver-nb/0.log" Feb 23 15:27:44.850927 master-0 kubenswrapper[28758]: I0223 15:27:44.850863 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_714c1426-c191-4c69-8346-acf55252ed0a/openstack-network-exporter/0.log" Feb 23 
15:27:44.868243 master-0 kubenswrapper[28758]: I0223 15:27:44.868185 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d4dd9afd-e1bd-494b-a6d8-d18012ac483b/ovsdbserver-sb/0.log" Feb 23 15:27:44.873868 master-0 kubenswrapper[28758]: I0223 15:27:44.873728 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_d4dd9afd-e1bd-494b-a6d8-d18012ac483b/openstack-network-exporter/0.log" Feb 23 15:27:44.933228 master-0 kubenswrapper[28758]: I0223 15:27:44.933172 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5fb8596466-gd59d_041b044e-bc1a-4693-8c06-6260d5fe663e/placement-log/0.log" Feb 23 15:27:44.976590 master-0 kubenswrapper[28758]: I0223 15:27:44.976505 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5fb8596466-gd59d_041b044e-bc1a-4693-8c06-6260d5fe663e/placement-api/0.log" Feb 23 15:27:45.023286 master-0 kubenswrapper[28758]: I0223 15:27:45.023234 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e18414da-932f-4a26-ab6a-af32aa83196b/rabbitmq/0.log" Feb 23 15:27:45.030729 master-0 kubenswrapper[28758]: I0223 15:27:45.030679 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e18414da-932f-4a26-ab6a-af32aa83196b/setup-container/0.log" Feb 23 15:27:45.079333 master-0 kubenswrapper[28758]: I0223 15:27:45.079207 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_373bbd85-b2d4-40a4-afc1-3ecf50a666e7/rabbitmq/0.log" Feb 23 15:27:45.093375 master-0 kubenswrapper[28758]: I0223 15:27:45.092754 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_373bbd85-b2d4-40a4-afc1-3ecf50a666e7/setup-container/0.log" Feb 23 15:27:45.211685 master-0 kubenswrapper[28758]: I0223 15:27:45.211631 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-7cd7df45f8-pgvmr_3d1bdc4f-86b7-4c36-906b-4aa5e49cd017/proxy-httpd/0.log" Feb 23 15:27:45.224718 master-0 kubenswrapper[28758]: I0223 15:27:45.224673 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7cd7df45f8-pgvmr_3d1bdc4f-86b7-4c36-906b-4aa5e49cd017/proxy-server/0.log" Feb 23 15:27:45.236937 master-0 kubenswrapper[28758]: I0223 15:27:45.236897 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m897g_518b33ee-87c2-4090-a971-735d98c01d7f/swift-ring-rebalance/0.log" Feb 23 15:27:45.260780 master-0 kubenswrapper[28758]: I0223 15:27:45.260672 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/account-server/0.log" Feb 23 15:27:45.282651 master-0 kubenswrapper[28758]: I0223 15:27:45.282544 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/account-replicator/0.log" Feb 23 15:27:45.289043 master-0 kubenswrapper[28758]: I0223 15:27:45.288997 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/account-auditor/0.log" Feb 23 15:27:45.299318 master-0 kubenswrapper[28758]: I0223 15:27:45.299271 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/account-reaper/0.log" Feb 23 15:27:45.308343 master-0 kubenswrapper[28758]: I0223 15:27:45.308282 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/container-server/0.log" Feb 23 15:27:45.332625 master-0 kubenswrapper[28758]: I0223 15:27:45.332561 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/container-replicator/0.log" Feb 23 15:27:45.339543 master-0 
kubenswrapper[28758]: I0223 15:27:45.339466 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/container-auditor/0.log" Feb 23 15:27:45.347336 master-0 kubenswrapper[28758]: I0223 15:27:45.347286 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/container-updater/0.log" Feb 23 15:27:45.361745 master-0 kubenswrapper[28758]: I0223 15:27:45.361707 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/object-server/0.log" Feb 23 15:27:45.379196 master-0 kubenswrapper[28758]: I0223 15:27:45.379130 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/object-replicator/0.log" Feb 23 15:27:45.392440 master-0 kubenswrapper[28758]: I0223 15:27:45.392398 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/object-auditor/0.log" Feb 23 15:27:45.399991 master-0 kubenswrapper[28758]: I0223 15:27:45.399929 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/object-updater/0.log" Feb 23 15:27:45.409013 master-0 kubenswrapper[28758]: I0223 15:27:45.408978 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/object-expirer/0.log" Feb 23 15:27:45.416032 master-0 kubenswrapper[28758]: I0223 15:27:45.415981 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/rsync/0.log" Feb 23 15:27:45.424345 master-0 kubenswrapper[28758]: I0223 15:27:45.424314 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_3b978484-31f3-46af-aedd-e96a997da517/swift-recon-cron/0.log" Feb 23 15:27:46.948758 master-0 kubenswrapper[28758]: I0223 15:27:46.948706 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f33aa5f-106b-4743-9d67-758977e09c33/alertmanager/0.log" Feb 23 15:27:46.970226 master-0 kubenswrapper[28758]: I0223 15:27:46.970169 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f33aa5f-106b-4743-9d67-758977e09c33/config-reloader/0.log" Feb 23 15:27:46.988281 master-0 kubenswrapper[28758]: I0223 15:27:46.988194 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f33aa5f-106b-4743-9d67-758977e09c33/kube-rbac-proxy-web/0.log" Feb 23 15:27:47.010015 master-0 kubenswrapper[28758]: I0223 15:27:47.009649 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f33aa5f-106b-4743-9d67-758977e09c33/kube-rbac-proxy/0.log" Feb 23 15:27:47.028084 master-0 kubenswrapper[28758]: I0223 15:27:47.027864 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f33aa5f-106b-4743-9d67-758977e09c33/kube-rbac-proxy-metric/0.log" Feb 23 15:27:47.040616 master-0 kubenswrapper[28758]: I0223 15:27:47.040572 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f33aa5f-106b-4743-9d67-758977e09c33/prom-label-proxy/0.log" Feb 23 15:27:47.054690 master-0 kubenswrapper[28758]: I0223 15:27:47.054550 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1f33aa5f-106b-4743-9d67-758977e09c33/init-config-reloader/0.log" Feb 23 15:27:47.150553 master-0 kubenswrapper[28758]: I0223 15:27:47.149953 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6bb6d78bf-wzqcp_646fece3-4a42-4e0c-bcc7-5f705f948d63/cluster-monitoring-operator/0.log" Feb 23 15:27:47.170517 master-0 kubenswrapper[28758]: I0223 15:27:47.170394 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-59584d565f-pdl4r_f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/kube-state-metrics/0.log" Feb 23 15:27:47.183202 master-0 kubenswrapper[28758]: I0223 15:27:47.183147 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-59584d565f-pdl4r_f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/kube-rbac-proxy-main/0.log" Feb 23 15:27:47.201176 master-0 kubenswrapper[28758]: I0223 15:27:47.201092 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-59584d565f-pdl4r_f05baa8c-39e2-4a73-aa0f-f1ccf074fd4c/kube-rbac-proxy-self/0.log" Feb 23 15:27:47.219455 master-0 kubenswrapper[28758]: I0223 15:27:47.219300 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56556ccb8b-kfqz7_810a9771-08c1-45d8-944c-f9a341d90bec/metrics-server/0.log" Feb 23 15:27:47.238506 master-0 kubenswrapper[28758]: I0223 15:27:47.236090 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-564f967f45-775n2_5196fb2b-b771-4a7a-a12c-474f43b82c5f/monitoring-plugin/0.log" Feb 23 15:27:47.262582 master-0 kubenswrapper[28758]: I0223 15:27:47.259660 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ckhv6_15ad7f4e-44c6-4426-8b97-c47a47786544/node-exporter/0.log" Feb 23 15:27:47.273975 master-0 kubenswrapper[28758]: I0223 15:27:47.273915 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ckhv6_15ad7f4e-44c6-4426-8b97-c47a47786544/kube-rbac-proxy/0.log" Feb 23 15:27:47.299730 master-0 kubenswrapper[28758]: I0223 
15:27:47.299465 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ckhv6_15ad7f4e-44c6-4426-8b97-c47a47786544/init-textfile/0.log" Feb 23 15:27:47.328506 master-0 kubenswrapper[28758]: I0223 15:27:47.324833 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-6dbff8cb4c-9qg7j_fae9a4cf-2acf-4728-9105-87e004052fe5/kube-rbac-proxy-main/0.log" Feb 23 15:27:47.339626 master-0 kubenswrapper[28758]: I0223 15:27:47.339582 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-6dbff8cb4c-9qg7j_fae9a4cf-2acf-4728-9105-87e004052fe5/kube-rbac-proxy-self/0.log" Feb 23 15:27:47.362867 master-0 kubenswrapper[28758]: I0223 15:27:47.362717 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-6dbff8cb4c-9qg7j_fae9a4cf-2acf-4728-9105-87e004052fe5/openshift-state-metrics/0.log" Feb 23 15:27:47.403283 master-0 kubenswrapper[28758]: I0223 15:27:47.403225 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c97be973-6eff-47c3-970d-d711ffc2750b/prometheus/0.log" Feb 23 15:27:47.415816 master-0 kubenswrapper[28758]: I0223 15:27:47.415745 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c97be973-6eff-47c3-970d-d711ffc2750b/config-reloader/0.log" Feb 23 15:27:47.432937 master-0 kubenswrapper[28758]: I0223 15:27:47.432864 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c97be973-6eff-47c3-970d-d711ffc2750b/thanos-sidecar/0.log" Feb 23 15:27:47.451730 master-0 kubenswrapper[28758]: I0223 15:27:47.448896 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c97be973-6eff-47c3-970d-d711ffc2750b/kube-rbac-proxy-web/0.log" Feb 23 15:27:47.464506 master-0 kubenswrapper[28758]: 
I0223 15:27:47.464436 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c97be973-6eff-47c3-970d-d711ffc2750b/kube-rbac-proxy/0.log" Feb 23 15:27:47.479980 master-0 kubenswrapper[28758]: I0223 15:27:47.479899 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c97be973-6eff-47c3-970d-d711ffc2750b/kube-rbac-proxy-thanos/0.log" Feb 23 15:27:47.494652 master-0 kubenswrapper[28758]: I0223 15:27:47.494605 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c97be973-6eff-47c3-970d-d711ffc2750b/init-config-reloader/0.log" Feb 23 15:27:47.518257 master-0 kubenswrapper[28758]: I0223 15:27:47.516197 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-754bc4d665-nl92v_18da400b-2271-455d-be0d-0ed44c74f78d/prometheus-operator/0.log" Feb 23 15:27:47.531588 master-0 kubenswrapper[28758]: I0223 15:27:47.530825 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-754bc4d665-nl92v_18da400b-2271-455d-be0d-0ed44c74f78d/kube-rbac-proxy/0.log" Feb 23 15:27:47.562006 master-0 kubenswrapper[28758]: I0223 15:27:47.561621 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-75d56db95f-rg8tp_c02c8912-46c9-4f86-ad28-9bfb2eca4e54/prometheus-operator-admission-webhook/0.log" Feb 23 15:27:47.580996 master-0 kubenswrapper[28758]: I0223 15:27:47.580935 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-dbf68b6c5-fp955_212126c7-1db5-456c-add5-d0e3f38fe315/telemeter-client/0.log" Feb 23 15:27:47.597663 master-0 kubenswrapper[28758]: I0223 15:27:47.597618 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-dbf68b6c5-fp955_212126c7-1db5-456c-add5-d0e3f38fe315/reload/0.log" 
Feb 23 15:27:47.610707 master-0 kubenswrapper[28758]: I0223 15:27:47.610596 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-dbf68b6c5-fp955_212126c7-1db5-456c-add5-d0e3f38fe315/kube-rbac-proxy/0.log" Feb 23 15:27:47.635310 master-0 kubenswrapper[28758]: I0223 15:27:47.635259 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bb4c94777-hhjw5_29e46f35-a59a-4c26-82dd-d7573bdcb564/thanos-query/0.log" Feb 23 15:27:47.650917 master-0 kubenswrapper[28758]: I0223 15:27:47.650846 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bb4c94777-hhjw5_29e46f35-a59a-4c26-82dd-d7573bdcb564/kube-rbac-proxy-web/0.log" Feb 23 15:27:47.670875 master-0 kubenswrapper[28758]: I0223 15:27:47.670810 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bb4c94777-hhjw5_29e46f35-a59a-4c26-82dd-d7573bdcb564/kube-rbac-proxy/0.log" Feb 23 15:27:47.684311 master-0 kubenswrapper[28758]: I0223 15:27:47.684238 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bb4c94777-hhjw5_29e46f35-a59a-4c26-82dd-d7573bdcb564/prom-label-proxy/0.log" Feb 23 15:27:47.701993 master-0 kubenswrapper[28758]: I0223 15:27:47.701931 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bb4c94777-hhjw5_29e46f35-a59a-4c26-82dd-d7573bdcb564/kube-rbac-proxy-rules/0.log" Feb 23 15:27:47.721684 master-0 kubenswrapper[28758]: I0223 15:27:47.721636 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bb4c94777-hhjw5_29e46f35-a59a-4c26-82dd-d7573bdcb564/kube-rbac-proxy-metrics/0.log" Feb 23 15:27:50.290911 master-0 kubenswrapper[28758]: I0223 15:27:50.285336 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-69bbfbf88f-vdjqg_669f05a2-9b50-4d6e-8d16-0a5050939b84/controller/0.log" Feb 23 15:27:50.316684 master-0 kubenswrapper[28758]: I0223 15:27:50.316636 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-vdjqg_669f05a2-9b50-4d6e-8d16-0a5050939b84/kube-rbac-proxy/0.log" Feb 23 15:27:50.342778 master-0 kubenswrapper[28758]: I0223 15:27:50.342719 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/controller/0.log" Feb 23 15:27:50.708707 master-0 kubenswrapper[28758]: I0223 15:27:50.708653 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-vdjqg_669f05a2-9b50-4d6e-8d16-0a5050939b84/controller/0.log" Feb 23 15:27:50.714142 master-0 kubenswrapper[28758]: I0223 15:27:50.714111 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-vdjqg_669f05a2-9b50-4d6e-8d16-0a5050939b84/kube-rbac-proxy/0.log" Feb 23 15:27:50.736156 master-0 kubenswrapper[28758]: I0223 15:27:50.736116 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/controller/0.log" Feb 23 15:27:51.189509 master-0 kubenswrapper[28758]: I0223 15:27:51.187942 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5_56066a12-1d5b-4e64-9227-290d644a6906/extract/0.log" Feb 23 15:27:51.199461 master-0 kubenswrapper[28758]: I0223 15:27:51.199408 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5_56066a12-1d5b-4e64-9227-290d644a6906/util/0.log" Feb 23 15:27:51.208501 master-0 kubenswrapper[28758]: I0223 15:27:51.208096 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_11a76e9741f3be63a88784b9f3f329441c07f3f3de97b4e48123ebda147ttx5_56066a12-1d5b-4e64-9227-290d644a6906/pull/0.log" Feb 23 15:27:53.167240 master-0 kubenswrapper[28758]: I0223 15:27:53.167194 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/frr/0.log" Feb 23 15:27:53.196858 master-0 kubenswrapper[28758]: I0223 15:27:53.196782 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/reloader/0.log" Feb 23 15:27:53.212532 master-0 kubenswrapper[28758]: I0223 15:27:53.212466 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/frr-metrics/0.log" Feb 23 15:27:53.228559 master-0 kubenswrapper[28758]: I0223 15:27:53.228470 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/kube-rbac-proxy/0.log" Feb 23 15:27:53.243807 master-0 kubenswrapper[28758]: I0223 15:27:53.243743 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/kube-rbac-proxy-frr/0.log" Feb 23 15:27:53.263248 master-0 kubenswrapper[28758]: I0223 15:27:53.260681 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-frr-files/0.log" Feb 23 15:27:53.277174 master-0 kubenswrapper[28758]: I0223 15:27:53.277093 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-reloader/0.log" Feb 23 15:27:53.299600 master-0 kubenswrapper[28758]: I0223 15:27:53.299490 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-metrics/0.log" Feb 23 15:27:53.324272 master-0 
kubenswrapper[28758]: I0223 15:27:53.324218 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-6gz6s_df5d30d1-17cc-4d03-9f0f-d5f34c0b5715/frr-k8s-webhook-server/0.log" Feb 23 15:27:53.392143 master-0 kubenswrapper[28758]: I0223 15:27:53.392075 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d66bdc8f4-chnqv_0bb85bd1-300c-4786-96ff-56978d399495/manager/0.log" Feb 23 15:27:53.416744 master-0 kubenswrapper[28758]: I0223 15:27:53.416688 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57446bd8dd-6nwv7_7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1/webhook-server/0.log" Feb 23 15:27:53.544259 master-0 kubenswrapper[28758]: I0223 15:27:53.544174 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/frr/0.log" Feb 23 15:27:53.555231 master-0 kubenswrapper[28758]: I0223 15:27:53.555179 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/reloader/0.log" Feb 23 15:27:53.561757 master-0 kubenswrapper[28758]: I0223 15:27:53.561708 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/frr-metrics/0.log" Feb 23 15:27:53.575614 master-0 kubenswrapper[28758]: I0223 15:27:53.575548 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/kube-rbac-proxy/0.log" Feb 23 15:27:53.581050 master-0 kubenswrapper[28758]: I0223 15:27:53.581006 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/kube-rbac-proxy-frr/0.log" Feb 23 15:27:53.595326 master-0 kubenswrapper[28758]: I0223 15:27:53.595269 28758 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-frr-files/0.log" Feb 23 15:27:53.622094 master-0 kubenswrapper[28758]: I0223 15:27:53.622032 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-reloader/0.log" Feb 23 15:27:53.789032 master-0 kubenswrapper[28758]: I0223 15:27:53.788959 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-frlvg_7979db93-36fa-4bbd-99c6-e8c8ecc114f3/cp-metrics/0.log" Feb 23 15:27:53.823942 master-0 kubenswrapper[28758]: I0223 15:27:53.823875 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-6gz6s_df5d30d1-17cc-4d03-9f0f-d5f34c0b5715/frr-k8s-webhook-server/0.log" Feb 23 15:27:53.902859 master-0 kubenswrapper[28758]: I0223 15:27:53.902801 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d66bdc8f4-chnqv_0bb85bd1-300c-4786-96ff-56978d399495/manager/0.log" Feb 23 15:27:53.913999 master-0 kubenswrapper[28758]: I0223 15:27:53.913930 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-57446bd8dd-6nwv7_7a2a7f30-5dd1-4918-9e1d-f65cb8aa6fc1/webhook-server/0.log" Feb 23 15:27:54.217860 master-0 kubenswrapper[28758]: I0223 15:27:54.217811 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cgfmv_1b984ba9-84db-4b51-ac8d-f92f22a4b76f/speaker/0.log" Feb 23 15:27:54.259297 master-0 kubenswrapper[28758]: I0223 15:27:54.259248 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cgfmv_1b984ba9-84db-4b51-ac8d-f92f22a4b76f/kube-rbac-proxy/0.log" Feb 23 15:27:54.541584 master-0 kubenswrapper[28758]: I0223 15:27:54.539208 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-cgfmv_1b984ba9-84db-4b51-ac8d-f92f22a4b76f/speaker/0.log" Feb 23 15:27:54.545055 master-0 kubenswrapper[28758]: I0223 15:27:54.544998 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cgfmv_1b984ba9-84db-4b51-ac8d-f92f22a4b76f/kube-rbac-proxy/0.log" Feb 23 15:27:56.688383 master-0 kubenswrapper[28758]: I0223 15:27:56.688317 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-9trkk_c9893b1e-ab36-4e25-93f9-fe364916347a/manager/0.log" Feb 23 15:27:57.024925 master-0 kubenswrapper[28758]: I0223 15:27:57.024418 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bcf775fc9-z5t5b_57b57915-64dd-42f5-b06f-bc4bcc06b667/cluster-node-tuning-operator/1.log" Feb 23 15:27:57.034669 master-0 kubenswrapper[28758]: I0223 15:27:57.034610 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bcf775fc9-z5t5b_57b57915-64dd-42f5-b06f-bc4bcc06b667/cluster-node-tuning-operator/0.log" Feb 23 15:27:57.057993 master-0 kubenswrapper[28758]: I0223 15:27:57.057929 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-wsx6c_9b558268-2262-4593-893e-408639a9987d/tuned/0.log" Feb 23 15:27:57.353603 master-0 kubenswrapper[28758]: I0223 15:27:57.353257 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-zgz2h_b99e7588-8c4f-48ff-8de9-12eef47ee79b/manager/0.log" Feb 23 15:27:57.365502 master-0 kubenswrapper[28758]: I0223 15:27:57.364520 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-vd6sh_5602bab5-e632-45c0-9a58-fca8c507ff8d/manager/0.log" Feb 23 15:27:57.470505 
master-0 kubenswrapper[28758]: I0223 15:27:57.470161 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-96nsl_14ce0057-4678-4bf7-bf28-256945f8a589/manager/0.log" Feb 23 15:27:57.546633 master-0 kubenswrapper[28758]: I0223 15:27:57.546553 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-psqph_13419178-05f6-4d41-be2b-2849b477ff68/manager/0.log" Feb 23 15:27:57.556081 master-0 kubenswrapper[28758]: I0223 15:27:57.556013 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-5kwfs_64270f21-553d-4155-b8e0-b45d5547285b/manager/0.log" Feb 23 15:27:57.794958 master-0 kubenswrapper[28758]: I0223 15:27:57.794855 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5f879c76b6-5zmbh_89e18c7d-ffb4-44e5-b640-e26175c114e1/manager/0.log" Feb 23 15:27:57.878638 master-0 kubenswrapper[28758]: I0223 15:27:57.878581 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-dkqkb_ada28ec1-a66b-4668-941c-c0f0cd424ee4/manager/0.log" Feb 23 15:27:57.970711 master-0 kubenswrapper[28758]: I0223 15:27:57.970429 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-zftk6_b00988a0-8239-4f73-832d-5e28c3afac6a/manager/0.log" Feb 23 15:27:57.985274 master-0 kubenswrapper[28758]: I0223 15:27:57.983982 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-7kkjf_61cee583-7aa7-483b-b0e4-96f48d26a940/manager/0.log" Feb 23 15:27:58.022162 master-0 kubenswrapper[28758]: I0223 15:27:58.022106 28758 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-xnn4r_f257fd05-c591-4324-94b4-8f87a7741118/manager/0.log" Feb 23 15:27:58.088143 master-0 kubenswrapper[28758]: I0223 15:27:58.088010 28758 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-bsbx5_63f4cc5c-7ea4-40dd-8ada-a8508d600f2a/manager/0.log"